Comment by noduerme 9 hours ago

I thought it was interesting that a Google flack stressed that the model would run on Apple's compute, and seemed to imply it might even run on-device. Allegedly this was said to allay the (expected) privacy concerns of Apple users who wouldn't want their Siri convos shared with Google.

But I saw something else in that statement. Is there going to be some quantized version of Gemini tailored to run on-device on an M4? If so, that would catapult Apple into an entirely new category merging consumer hardware with frontier models.

sublimefire 7 hours ago

You can already run quantized models without much friction; there are dedicated apps for exactly that. It changes very little for most people: everyone who wanted to do this has already solved it, and those who haven't don't care. It's a marginal gain for the consumer, a feature to brag about for Apple, and a big gain for Google. Users would also need to change existing habits, which is undoubtedly hard to do.