Comment by theshrike79 4 days ago
The point of these models isn't to have all the knowledge in the world available.
It's to understand enough language to figure out which tools to call.
"What's my agenda for today" -> get more context
    cal = getCalendar()
    getWeather(user.location())
    getTraffic(user.location(), cal[0].location)
etc.
Then grab the return values from those and output:
"You've got a 9am meeting in Foobar, the traffic is normal and it looks like it's going to rain after the meeting."
Not rocket science, and not something you'd want to feed to a VC-powered, energy-hogging LLM when you can literally run it in your pocket.
Isn't this what Apple tried with Siri? I don't see anyone using it, and adding an LLM to the mix is going to make it less accurate.