InsideOutSanta 2 days ago

There's no comparison. GLM 4.7 is fine and reasonably competent at writing code, but K2.5 is right up there with something like Sonnet 4.5. It's the first time I can use an open-source model and not immediately tell the difference between it and top-end models from Anthropic and OpenAI.

Alifatisk a day ago

Kimi K2.5 is a beast: it sounds very human (K2 was also good at this) and completes whatever I throw at it. However, the GLM quarterly coding plan is too good a deal. The Christmas deal ends today, so I'd still suggest sticking with it. A better model will always come along.

zeroxfe 2 days ago

It's waaay better than GLM 4.7 (which was the open model I was using earlier)! Kimi was able to quickly and smoothly finish some very complex tasks that GLM completely choked on.

cmrdporcupine 2 days ago

From what people say, it's better than GLM 4.7 (and, I guess, DeepSeek 3.2).

But it's also like... 10x the price per output token on any of the providers I've looked at.

I don't feel it's 10x the value. It's still much cheaper than paying by the token for Sonnet or Opus, but if you have a subscription plan from the Big 3 (OpenAI, Anthropic, Google), that's much better value for your money.

I guess it comes down to ethical or openness reasons for using it.

  • esafak 2 days ago

Exactly. For the price, it has to beat Claude and GPT, unless you have the budget for both. I just let GLM solve whatever it can and reserve my Claude budget for the rest.

akudha 2 days ago

Is the Lite plan enough for your projects?

  • BeetleB 2 days ago

Very much so. I'm using it for small personal stuff on my home PC, nothing grand. Not having to worry about token usage has been great (previously I was paying per API call).

    I haven't stress tested it with anything large. Both at work and home, I don't give much free rein to the AI (e.g. I examine and approve all code changes).

The Lite plan doesn't have vision, so you cannot copy/paste an image into it. But I can always switch models when I need to.