Show HN: Fixthisbug.de – AI Code Fixes with Local LLM for Privacy
For fun (and learning), I let an AI build FixThisBug.de, an AI-powered code-fixing tool that differs from typical AI coding assistants in that it uses a local Ollama instance with the qwen2.5 model.
Key Features:
- AI-built: built entirely with AI, over several iterations via v0 and Cursor.
- Local Processing: Uses qwen2.5 model through Ollama for code analysis and fixes
- Dual Language: Full support for English and German interfaces
- "Free Tier": Limited daily fixes for casual users
- "Pro Access": Unlimited fixes for authenticated users (just for fun, its free :) )
- Privacy Compliant: GDPR-compliant & hosted in Germany
Tech Stack:
- Frontend: Next.js 15.0, React 19
- Backend: Supabase + Next.js API Routes
- AI: Local Ollama instance running qwen2.5
- Auth: Supabase Auth with hCaptcha (see the sketch after this list)
- Infrastructure: German servers for EU compliance
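I can't paste the real auth code here, but roughly how the Supabase Auth + hCaptcha wiring looks on the client (the component and env-var names are placeholders, not the actual source): the form collects an hCaptcha token and hands it to Supabase, which verifies it server-side when captcha protection is enabled for the project.

```tsx
// Sketch only: hCaptcha gating a Supabase sign-in on the client.
// "SignInForm" and the env-var names are placeholders, not the real code.
"use client";
import { useState } from "react";
import { createClient } from "@supabase/supabase-js";
import HCaptcha from "@hcaptcha/react-hcaptcha";

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

export default function SignInForm() {
  const [captchaToken, setCaptchaToken] = useState<string | undefined>();

  async function signIn(email: string, password: string) {
    // Supabase checks the captcha token server-side when captcha
    // protection is turned on for the project.
    const { error } = await supabase.auth.signInWithPassword({
      email,
      password,
      options: { captchaToken },
    });
    if (error) console.error(error.message);
  }

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        const data = new FormData(e.currentTarget);
        void signIn(String(data.get("email")), String(data.get("password")));
      }}
    >
      <input name="email" type="email" required />
      <input name="password" type="password" required />
      <HCaptcha
        sitekey={process.env.NEXT_PUBLIC_HCAPTCHA_SITEKEY!}
        onVerify={setCaptchaToken}
      />
      <button type="submit">Sign in</button>
    </form>
  );
}
```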
To add a little extra to the project: unlike GitHub Copilot or similar tools, FixThisBug.de processes everything locally on its own server. No external AI service is involved.
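For those curious about the local LLM integration, here is a rough sketch (not the actual source) of what the server-side call can look like. It assumes a Next.js route handler at app/api/fix/route.ts (hypothetical name) talking to Ollama's /api/generate endpoint on its default port 11434, so the submitted code never leaves the server.

```ts
// app/api/fix/route.ts -- hypothetical route, for illustration only.
// Sends the submitted snippet to a local Ollama instance and returns
// the model's suggested fix. No code leaves the server.
import { NextResponse } from "next/server";

const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

export async function POST(req: Request) {
  const { code, language } = await req.json();

  // Ollama's /api/generate endpoint; stream: false returns a single JSON object.
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5",
      prompt: `Fix the bugs in this ${language} code and explain the changes:\n\n${code}`,
      stream: false,
    }),
  });

  if (!res.ok) {
    return NextResponse.json({ error: "Model unavailable" }, { status: 502 });
  }

  const data = await res.json();
  return NextResponse.json({ fix: data.response });
}
```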
I'm particularly interested in feedback on:
1. The local LLM integration approach
2. Performance optimizations for Ollama
3. Your user experience with the service
4. Privacy considerations
Would love to hear your thoughts and suggestions!