Reuses OpenAIProvider via Ollama's OpenAI-compatible API at localhost:11434.
No API key is needed: just install Ollama, pull a model, and set LLM_PROVIDER=ollama.
Vision models (llava, llama3.2-vision) are supported for the screenshot fallback.
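The setup above can be sketched as follows, assuming a POSIX shell and the `ollama` CLI; the model names mirror the ones mentioned above, but any pulled model should work:

```shell
# Install Ollama (macOS/Linux; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a chat model, plus a vision model for the screenshot fallback
ollama pull llama3.2
ollama pull llava

# Select the Ollama provider; the app then talks to the
# OpenAI-compatible endpoint at http://localhost:11434
export LLM_PROVIDER=ollama
```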
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Removes the unnecessary nesting: all source, config, and docs now live
at the project root for simpler paths and commands.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>