Comment by mark_l_watson 6 days ago

I am retired now, out of the game, but I also suggest an alternative: run locally with open-codex, Ollama, and the qwen3 and gemma3 models, and when necessary use something hosted like Gemini 2.5 Pro, all without an IDE.
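
For anyone curious about the local half of that setup, here is a minimal sketch of the Ollama side; the model tags are whatever your hardware can handle, and the open-codex wiring is an assumption, so check its README for the exact flag or environment variable it expects.

    # Pull the local models (tags are assumptions -- pick sizes that fit your machine)
    ollama pull qwen3
    ollama pull gemma3

    # Ollama exposes an OpenAI-compatible API on localhost:11434;
    # a quick smoke test against the chat endpoint:
    curl http://localhost:11434/v1/chat/completions \
      -d '{"model": "qwen3", "messages": [{"role": "user", "content": "Say hello."}]}'

    # open-codex would then be pointed at this local endpoint instead of a hosted
    # provider -- see the open-codex README for how it configures local backends.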

I like to strike a balance between coding from scratch and using AI.