Comment by nashashmi a day ago

I don't understand why Apple can't implement their own LLM on-device for the easy pickings: settings control, app-specific shortcuts, local data search, that kind of thing.

I understand that other things, like image recognition or Wikipedia-style information, require external data sets, and sending local data off-device for that could be a privacy breach. But the local stuff should be easy, at least in one or two languages.

coder543 a day ago

All signs are that they are doing exactly that. They already have an on-device LLM that powers certain features, and I expect a better-trained version of that on-device model will ship with the "new Siri" update.

In the original announcement of the Siri revamp a couple of years ago, they specifically talked about having the on-device model handle everything it can, and only using the cloud models for harder or more open-ended questions.
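That on-device-first split can be sketched as a simple routing policy. This is a hypothetical illustration, not Apple's actual logic; the intent names and keyword matching here are made up, and a real system would use the on-device model itself to classify the request:

```python
# Hypothetical sketch of an on-device-first routing policy (not Apple's
# implementation): handle requests the local model can cover, and fall
# back to a cloud model only for open-ended questions.

# Assumed set of intents the on-device model can handle by itself.
LOCAL_INTENTS = {"settings", "shortcut", "local_search", "timer"}

def classify_intent(query: str) -> str:
    """Toy keyword-based classifier; stands in for the on-device model."""
    q = query.lower()
    if "brightness" in q or "wi-fi" in q:
        return "settings"
    if "search my" in q or "find my" in q:
        return "local_search"
    if "timer" in q:
        return "timer"
    return "open_ended"

def route(query: str) -> str:
    """Return which model should handle the query."""
    intent = classify_intent(query)
    return "on-device" if intent in LOCAL_INTENTS else "cloud"
```

Under this sketch, "turn down the brightness" stays on-device, while "explain how photosynthesis works" gets routed to the cloud model.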