4/21/2025 at 8:48:49 PM
This is very convenient and nice! But I could not get it to work with the best small models available for Ollama for programming, like https://ollama.com/MFDoom/deepseek-coder-v2-tool-calling for example.
by xyproto
4/21/2025 at 9:01:31 PM
Thanks so much! Was the model too big to run locally?
That's one of the reasons I went with phi-4-mini - surprisingly high quality for its size and speed. It handled multi-step reasoning, math, structured data extraction, and code pretty well, all on modest hardware. Phi-1.5 / Phi-2 (quantized versions) also run on a Raspberry Pi, as others have demonstrated.
by codingmoh
4/22/2025 at 7:09:10 AM
The models run fine locally with "ollama run". When trying out "phi4" with:
open-codex --provider ollama --full-auto --project-doc README.md --model phi4:latest
I get this error:
OpenAI rejected the request. Error details: Status: 400, Code: unknown, Type: api_error, Message: 400
registry.ollama.ai/library/phi4:latest does not support tools. Please verify your settings and try again.
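The 400 above is Ollama rejecting tool definitions for models whose chat template has no tool support, which open-codex relies on. A minimal pre-flight sketch that probes a model before pointing an agent at it (assuming a default Ollama install on localhost:11434; the `noop` tool and all function names here are illustrative, not part of open-codex):

```python
import json
import urllib.error
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default Ollama endpoint

# Minimal dummy tool definition, used only to trigger the capability check.
DUMMY_TOOL = {
    "type": "function",
    "function": {
        "name": "noop",
        "description": "placeholder tool for the capability probe",
        "parameters": {"type": "object", "properties": {}},
    },
}


def build_probe_payload(model: str) -> dict:
    """Build a minimal /api/chat request that includes a tool definition."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "tools": [DUMMY_TOOL],
        "stream": False,
    }


def lacks_tool_support(error_body: str) -> bool:
    """Ollama signals missing tool support with a 400 whose body says so."""
    return "does not support tools" in error_body


def model_supports_tools(model: str) -> bool:
    """Send the probe; True if the server accepts tool calls for `model`."""
    req = urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=json.dumps(build_probe_payload(model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as exc:
        return not lacks_tool_support(exc.read().decode())


if __name__ == "__main__":
    print(model_supports_tools("phi4:latest"))
```

If the probe fails with the "does not support tools" message, picking a tool-calling variant (as with the deepseek-coder-v2-tool-calling model linked above) is the workaround rather than any open-codex flag.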
by xyproto
4/21/2025 at 8:58:58 PM
That's a really old model now. Even the old Qwen 2.5 coder 32b model is better than DSv2.
by smcleod
4/21/2025 at 9:02:00 PM
I want to add support for Qwen 2.5 next.
by codingmoh
4/21/2025 at 9:20:54 PM
QwQ-32 might be worth looking into also, as a high-level planning tool.
by manmal
4/21/2025 at 9:30:11 PM
Thank you so much!by codingmoh
4/22/2025 at 7:24:21 AM
Hopefully Qwen 3 and, if we're lucky, Qwen 3 Coder might be out this week too.
by smcleod
4/22/2025 at 7:26:54 AM
Also, GLM 4 is pretty amazing - https://www.reddit.com/r/LocalLLaMA/comments/1k4w9p2/i_uploa...
by smcleod
4/22/2025 at 10:29:21 PM
Thanks, I'll have a look.
by codingmoh