gaws 2 hours ago

What privacy? If you're using ChatGPT or Claude, your chats are still logged.

  • satvikpendem 2 hours ago

    It's local, meaning it uses local models, which is what they said in the sentence prior to the privacy one.

    • lxgr an hour ago

      Unless you have unusually powerful hardware, local models unfortunately won't really cut it for Moltbot right now.

      • satvikpendem an hour ago

        OP implied they have powerful enough hardware, since Kimi runs on their computer; that's why they mentioned it's local. That it doesn't work for most people has no bearing on what the OP of this thread said. Regardless, you don't need an Opus-level model; you can use a smaller one that will just be slower to get back to you. It's all asynchronous anyway, unlike a coding agent, where some level of synchronicity is expected.

        • lxgr an hour ago

          GGP seems to be under the misapprehension that privacy is a core aspect/advantage of OpenClaw, when for most users it's really not.

          So yes, I think the majority user experience is very relevant.

    • jckahn 2 hours ago

      From what I've read, OpenClaw only truly works well with Opus 4.5.

      • satvikpendem 2 hours ago

        The latest Kimi model is comparable in performance, at least for these sorts of use cases, but yes, it is harder to use locally.

        • gaws an hour ago

          > harder to use locally

          Which means most people must be using OpenClaw connected to Claude or ChatGPT.

lxgr an hour ago

It's the other way around: at least for most people, it grants an LLM (and by extension its inference provider) in the cloud access to your personal data.