MontyCarloHall a day ago

Because their focus on user privacy makes it difficult for them to train at scale on users' data the way their competitors can. Ironically, this focus on privacy initially stemmed from fumbling the ball on Siri: recall that Apple never made privacy a core selling point until it was clear that Siri was years behind Google's equivalent, a gap Apple then retroactively tried to justify by claiming "we keep your data private, so we can't train on it the way Google can." The result was a vicious cycle: botch the initial AI rollout -> justify that failure with a novel privacy-centric marketing strategy that only makes it harder to improve AI capabilities -> botch subsequent AI rollouts as a result -> ...

To be clear, I'd much rather have my personal cloud data private than have good AI integration on my devices. But strictly from an AI-centric perspective, Apple painted themselves into a corner.

potamic a day ago

That's a poor justification. There are companies that sell you all kinds of labelled data. OpenAI and Anthropic didn't train on their own users' data.

tensor a day ago

This is nonsense. You don't need Apple user data to build a good AI model; plenty of startups building base models have shown that. But even if you did, the argument still fails: Apple has long had an opt-in for providing data to train its machine learning models, and many of those models, like OCR and voice recognition, are excellent.

wat10000 a day ago

Apple's privacy focus started long before the current AI wave. It got major public attention in the fight with the FBI over unlocking the San Bernardino shooter's phone. I don't think Google's equivalent even existed at that point.