Comment by someguyiguess 7 hours ago
I don’t get it. You can do that with the Claude app or ChatGPT too. What’s the value add?
Edit: oh I see. It’s local. So privacy. Quite a good value add actually.
It's local, meaning it uses local models, which is what they said in the sentence right before the privacy one.
OP implied they have powerful enough hardware, since Kimi runs on their computer; that's why they mentioned it's local. Whether that works for most people has no bearing on what the OP of this thread said. Regardless, you don't need an Opus-level model: a smaller one will just be slower to get back to you. It's all asynchronous anyway, unlike a coding agent, where some level of synchronicity is expected.
The latest Kimi model is comparable in performance, at least for these sorts of use cases, but yes, it's harder to run locally.
What privacy? If you're using ChatGPT or Claude, your chats are still logged.