Comment by satvikpendem 3 hours ago
The latest Kimi model is comparable in performance, at least for these sorts of use cases, but yes, it is harder to use locally.
> harder to use locally
Which means most people must be using OpenClaw connected to Claude or ChatGPT.