Apple picks Gemini to power Siri
(cnbc.com)
965 points by stygiansonic a day ago
It's clear they don't have the in-house expertise to do it themselves. They aren't an AI player. So it's not a mistake, just a necessity.
Maybe someday they'll build their own, the way they eventually replaced Google Maps with Apple Maps. But I think they recognize that that will be years away.
Apple has been using ML in their products for years, to the point that they dedicated parts of their custom silicon for it before the LLM craze. They clearly have some in-house ML talent, but I suppose LLM talent may be a different question.
I’m wondering if this is a way to shift blame for issues. It was mentioned in an interview that what they built internally wasn’t good enough, presumably due to hallucinations… but every AI does that. They know customers have a low tolerance for mistakes, and any issues quickly become a meme (see the Apple Maps launch). If the technology is inherently flawed and will never live up to their standards, then by outsourcing it they can point to Google as the source of the failings. If things get better down the road and they can improve by pivoting away from Google, they’ll look better and it will make Google look bad. This could be the long game.
They may also save a fortune in training their own models, if they don’t plan to directly try to monetize the AI, and simply have it as a value add for existing customers. Not to mention staying out of hot water related to stealing art for training data, as a company heavily used by artists.
I agree that they don't appear poised to do it themselves. But why not work with Meta or OpenAI (maybe a bit more questionable with MS) or some other player, rather than Google?
The optics of working with Meta make it a non-starter. Apple symbolizes privacy, Meta the opposite.
With OpenAI, will it even be around 3 years from now, without going bankrupt? What will its ownership structure look like? Plus, as you say, the MS aspect.
So why not Google? It's very common for large corporations to compete in some areas and cooperate in others.
SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.
I didn't see your 41-day-old reply until it was too late to comment on it. So here's a sarcastic "thanks" for ignoring what I wrote and for telling me that exactly what I was complaining about is the solution to the problem I was complaining about.
https://news.ycombinator.com/item?id=46114935
1) I told you my household can't use Target or Amazon for unscented products without costly remediation measures, BECAUSE EVEN SCENT-FREE ITEMS ARRIVE SMELLING OF PERFUME FROM CROSS-CONTAMINATION DURING CLEANING, STORAGE, AND TRANSPORTATION. SOMETIMES REALLY BADLY.
FFS. If you are going to respond, first read.
I also mentioned something other than "government intervention to dictate how products are made" as a solution to this issue, namely adequate segregation between perfumed and non-perfumed products.
And I care less about my wallet than I do about my time and actual ability to acquire products that are either truly scent free, or like yesteryear, don't have everlasting fragrance fixatives.
For people in my position, who make up a small percentage of the population (but still number in the millions), the free market has failed. We are a specialized niche that trades tips on how to make things tolerable.
SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.
Apple has surprisingly good quality AI papers, a lot of work on bridging research and product.
> AI is such a core part of the experience
For who? Regular people are quite famously not clamouring for more AI features in software. A Siri that is not so stupendously dumb would be nice, but I doubt it would even be a consideration for the vast majority of people choosing a phone.
Web search is a core part of browsing and Apple is Google's biggest competitor in browsers. Google is paying Apple about 25x for integrating Google Search in Safari as Apple will be paying Google to integrate Google's LLMs into Siri. If you think depending on your competitor is a problem, you should really look into web search where all the real money is today.
Pity they don't have their own thing at that level. They had a great start introducing Siri then totally missed the train.
Someone inside and up there neglected the "a bicycle for the mind" part of the vision.
I thought it was interesting that a Google flack stressed that the model would run on Apple's compute, and seemed to imply it might even run on-device. Allegedly this was said to allay the (expected) privacy concerns of Apple users who wouldn't want their Siri convos shared with Google.
But I saw something else in that statement. Is there going to be some quantized version of Gemini tailored to run on-device on an M4? If so, that would catapult Apple into an entirely new category merging consumer hardware with frontier models.
You can already run quantized models without much friction; there are dedicated apps for it. It changes very little for most people: everyone who wanted to do this has already solved it, and those who haven't don't care. It's a marginal gain for the consumer, a feature to brag about for Apple, and a big gain for Google. Users would also need to change existing habits, which is undoubtedly hard to do.
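For readers unfamiliar with the term: quantization trades numerical precision for memory, storing weights in fewer bits so larger models can fit on a phone. A toy symmetric 8-bit sketch, purely illustrative — real schemes (per-channel, 4-bit group-wise, etc.) are considerably more involved:

```python
# Toy symmetric int8 quantization: store float weights as int8 values
# plus one scale factor, and reconstruct them approximately on the fly.
# Illustrative only -- not how any production runtime actually does it.

def quantize(weights):
    scale = max(abs(w) for w in weights) / 127
    q = [round(w / scale) for w in weights]  # values in [-127, 127]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

w = [0.12, -0.5, 0.33, 1.0]
q, s = quantize(w)
restored = dequantize(q, s)
print([round(x, 2) for x in restored])  # → [0.12, -0.5, 0.33, 1.0]
```

The point of the exercise: four floats became four bytes plus one scale, at the cost of a small reconstruction error that grows as you use fewer bits.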
Apple seems to be positioning this announcement as "on-device intelligence", with the caveat that Apple promotes its Private Cloud Compute as "on-device", or at least "on-device-like". I'd be curious to see a breakdown of what they expect to run on-device versus in Private Cloud Compute for this Siri project. I'm a little on the fence about whether Private Cloud Compute counts as "on-device", but I'm hopeful it is a good idea and is as well built/considered as its documentation says it is.
Gemini has the nano models that run on-device as well, right?
To me, Apple's ML business was all about federated learning. I know this concept dates from the pre-transformer era; I conjectured that one of the reasons Apple didn't adopt LLMs right away was that they couldn't find a reasonable way to do federated learning with LLMs. I wonder whether Apple will give up on this idea. And I would like to see how it could be done with current AI systems.
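For context, federated learning keeps the training data on each device and only shares model updates with a central server. A toy sketch of federated averaging (FedAvg) for a one-parameter linear model — purely illustrative, not Apple's system:

```python
# Toy sketch of federated averaging (FedAvg): each device takes a
# gradient step on its own private data, and the server only ever
# sees (and averages) the resulting weights -- never the raw data.
# Purely illustrative; not Apple's implementation.

def local_update(w, data, lr=0.02):
    # One gradient-descent step for a 1-D linear model y = w * x
    # on this device's private samples (x, y).
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(global_w, device_datasets):
    # One server round: broadcast the global weight, collect locally
    # updated weights, and average them.
    local_ws = [local_update(global_w, d) for d in device_datasets]
    return sum(local_ws) / len(local_ws)

# Three devices, each privately holding samples of y = 2 * x.
devices = [[(1, 2), (2, 4)], [(3, 6)], [(4, 8), (5, 10)]]
w = 0.0
for _ in range(50):
    w = fed_avg(w, devices)
print(round(w, 2))  # → 2.0
```

The open question the comment raises is exactly the hard part: this averaging step is cheap for a one-parameter model but awkward at LLM scale, where a single "update" is billions of parameters.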
Apple and Google already have the search relationship. Makes sense for this to happen. Am curious what kind of data Google gets out of the deal.
My experience with Gemini (3 Flash) has been pretty funny, not awful (but worse than Kimi K2 or GPT 5.2 Mini), but it's just so much worse at (or rather hyper focused on) following my custom instructions, I keep getting responses like:
The idiomatic "British" way of doing this ...
Alternatively, for an Imperial-style approach, ...
As a professional software engineer you really should ...
in response to programming/Linux/etc. questions! (Because I just have a short blurb about my educational background, career, and geography in there, which with every other model I've tried works great to ensure British spelling, UK information, metric units, and cutting the cruft because I know how to mkdir etc.)
It's given me a good laugh a few times, but just about getting old now.
Old news now I think, but good news. Except for my Apple Watch I have given up using Siri, but I use Gemini and think it is good in general, and awesome on my brother's Pixel phone.
Because Apple Silicon is so good for LLM inferencing, I hope they also do a deal for small on-device Gemma models.
Feels like this is a huge whiff. At best your implementation is as good as what you can get with Google's own offerings and Pixel; most likely Apple's offerings will always be just a bit behind. I'm probably a bit biased, as I've preferred Anthropic, but it seems like those two companies also align more on other outward policies like privacy.
Really, Siri is an agent. Agents thrive when the underlying model's capabilities are higher, as that unlocks a series of use cases that are hard to accomplish when the basic natural language processing layer is weak.
The better the basic NLP tasks like named entity recognition, PoS tagging, Dependency Parsing, Semantic Role Labelling, Event Extraction, Constituency parsing, Classification/Categorization, Question Answering, etc, are implemented by the model layer, the farther you can go on implementing meaningful use-cases in your agent.
Apple can now concentrate on making Siri a really useful and powerful agent.
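The agent pattern described above — a model layer interpreting the utterance, and the agent dispatching to concrete capabilities — can be sketched minimally. All tool names and the stand-in model below are hypothetical:

```python
# Minimal agent dispatch loop: a model layer proposes an action, the
# agent maps it onto a concrete capability. Toy sketch only -- in a real
# assistant the model layer would be an LLM emitting structured output.

TOOLS = {
    "set_timer": lambda minutes: f"Timer set for {minutes} minutes",
    "send_message": lambda to, body: f"Sent '{body}' to {to}",
}

def fake_model(utterance):
    # Stand-in for the NLP/model layer: maps text to a tool + arguments.
    # Hypothetical logic for illustration.
    if "timer" in utterance:
        return {"tool": "set_timer", "args": {"minutes": 10}}
    return {"tool": "send_message", "args": {"to": "Sam", "body": utterance}}

def agent(utterance):
    action = fake_model(utterance)
    return TOOLS[action["tool"]](**action["args"])

print(agent("set a timer for ten minutes"))  # → Timer set for 10 minutes
```

The better the model layer is at the NLP tasks listed above, the less brittle `fake_model` has to be — which is the sense in which stronger base models let the agent do more.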
This is good for Siri, in many ways. But I was kind of hoping we would see a time soon when phone hardware became good enough to do nearly 100% of the Siri-level tasks locally rather than needing Internet access.
I suspect we'll see that; but Siri is in such a bad state of disrepair that Apple really needs something now while they continue to look for micro-scale LLM models that can run well-enough locally. The two things aren't mutually exclusive.
The biggest thing Apple has to do is get a generic pipeline up and running, that can support both cloud and non-cloud models down the road, and integrate with a bunch of local tools for agent-style workloads (e.g. "restart", "audio volume", "take screenshot" as tools that agents via different cloud/local models can call on-device).
I'd hope it could be the other way around. Some stuff should be relatively straightforward -- summarizing notifications, emails, setting timers, things like that should be obviously on-device. But aside from that, I would hope that the on-device AI can make the determination on whether it is necessary to go to a datacenter AI for a better answer.
But you may be right, maybe on-device won't be smart enough to decide it isn't smart enough. Though it does seem like the local LLMs have gotten awfully good.
I can see them going that route, but it would cause similarly annoying breaks in the flow as current Siri offering to delegate to ChatGPT, or on-device Siri deciding it can do the task but actually failing or doing it wrong. It certainly wouldn’t be an “it just works” experience.
That was quick: https://www.macrumors.com/2026/01/12/elon-musk-reacts-to-gem...
I was not aware HN is now an investment discussion board. Even if you were to argue that point, what's his incremental value compared to the but-for world? I mean one where Steve Jobs is still alive and running Apple. I am sure Jobs would've sat on his behind milking iPhones and just let Google, Microsoft, Meta and Nvidia take the entire AI TAM. I am sure that's the Steve Jobs we all knew.
There are many dimensions one could assess a management team, but it is obviously ridiculous to call them “a joke” when they achieve their principal goal at such astonishing scale.
I am certain Apple will do just fine in the AI revolution, in large part because such a massive distribution and brand advantage is extremely hard to overcome.
Tim Cook replaced paying customers with Wall St. That's not a win for any of us except Apple shareholders, a number far, far lower than Apple users.
I wonder if we will see them take the final step and just make Gemini the default AI assistant on iPhone.
Might sound crazy but remember they did exactly this for web search. And Maps as well for many years.
This way they go from having to build and maintain Siri (which has negative brand value at this point) and pay Google's huge inference bills to actually charging Google for the privilege.
This morning I was wondering what happened to whatever arrangement I thought Apple had with OpenAI. In a way I think OpenAI is a competitor and “new money”. Pairing with Google makes sense especially considering that this is “normie-facing” technology. And from what I recall, a lot of Apple fans prefer “Hey Google” in their cars over CarPlay. Or something to that effect.
Does anyone know what Apple's "Private Cloud Compute" servers actually are? I recall murmurings about racked M chips or some custom datacenter-only variant?
I'm really curious how Apple is bridging the gap between consumer silicon and the datacenter scale stack they must have to run a customized Gemini model for millions of users.
RDMA over Thunderbolt is cool for small lab clusters but they must be using something else in the datacenter, right?
I don't understand why Apple can't implement their own LLM at the phone level for the easy pickings: settings control, app-specific shortcuts, local data searching.
I understand other things like image recognition, wikipedia information, etc require external data sets, and transferring over local data to that end can be a privacy breach. But the local stuff should be easy, at least in one or two languages.
All signs are that they are doing exactly that. They already have an on-device LLM which powers certain features, and I expect they will have a better-trained version of that on-device model that comes out with the "new Siri" update.
In the original announcement of the Siri revamped a couple of years ago, they specifically talked about having the on-device model handle everything it can, and only using the cloud models for the harder or more open ended questions.
I’m a long time Android user and almost switched to iPhone last year. Mostly because I use macOS and wanted better integration and also wanted to try it. Another big factor was the AI assistant. I stayed with Android because I think Google will win here. Apple will probably avoid losing users to their biggest competitor by reaching rough parity using the same models
Why is Apple's self-developed AI progressing so slowly that they still need to collaborate with Google? What's going on with their AI team? In this era of AI, it seems like Apple has already fallen behind.
"Google already pays Apple billions each year to be the default search engine on iPhones. But that lucrative partnership briefly came into question after Google was found to hold an illegal internet search monopoly.
In September, a judge ruled against a worst-case scenario outcome that could have forced Google to divest its Chrome browser business.
The decision also allowed Google to continue to make deals such as the one with Apple."
How much is Google paying Apple now
If these anti-competitive agreements^1 were public,^2 headlines could be something like,
(A) "Apple agrees to use Google's Gemini for AI-powered Siri for $[payment amount]"
Instead, headlines are something like,
(B) "Apple picks Google's Gemini to run AI-powered Siri"
1. In other words, they are exclusive and have anticompetitive effects
2. Neither CNBC nor I are suggesting that there is any requirement for the parties to make these agreements public. I am presenting a hypothetical relating to headlines, (A) versus (B), as indicated by the words "If" and "could"
https://www.cnet.com/tech/services-and-software/apple-to-rep...
Google pays 20 billion to Apple annually for search traffic
Apple allegedly pays Google about 1 billion per year for Gemini
Perhaps Gemini sends more search traffic to Google
The search traffic and data collection is worth far more than Gemini
It's probably anti-competitive, but I'm not sure about your argument which is that Apple and Google must disclose details of their business relationships just because they are Apple and Google. You could maybe argue something like this should be a requirement of publicly traded companies, but the long-term effect there would be fewer publicly traded companies so they don't have to disclose every deal they make.
Looks like Apple is getting paid $1 billion annually to use Gemini for AI-powered Siri
https://www.bloomberg.com/news/articles/2025-12-01/openai-ta...
Why are they constantly so bad at AI but so good at everything else?
Because their focus on user privacy makes it difficult for them to train at scale on users' data in the way that their competitors can. Ironically, this focus on privacy initially stemmed from fumbling the ball on Siri: recall that Apple never made privacy a core selling point until it was clear that Siri was years behind Google's equivalent, which Apple then retroactively tried to justify by claiming "we keep your data private so we can't train on it the way Google can." The result was a vicious cycle: initially botch AI rollout -> justify that failure with a novel marketing strategy around privacy that only makes it harder to improve their AI capabilities -> botch subsequent AI rollouts as a result -> ...
To be clear, I'd much rather have my personal cloud data private than have good AI integration on my devices. But strictly from an AI-centric perspective, Apple painted themselves into a corner.
This is nonsense. You don't need Apple user data to build a good AI model, plenty of startups building base models have shown that. But even if you did it's nonsense as Apple has long had opt-in for providing data to train their machine learning models, and many of those models, like OCR or voice recognition, are excellent.
It's pretty Apple-ish to not jump into a frenzy and to wait for the turbulence to settle, I believe. Delegation to Gemini fits that theory?
They tried to have an AI assistant before AI was a big thing... it was just pretty bad, and Siri never got better.
If it were to suddenly get better, like they teased (some would say lied about) with Apple Intelligence, that would fit pretty well. That they're delegating it to Gemini now is a defeat.
Gemini only replaced Google assistant on Android a few weeks ago. I gave up on Google assistant a few years ago, but I'd guess it wasn't a worthwhile upgrade from Siri.
Apple is almost purely customer products, they don't have the resources to compete with the giants in this field.
Their image classification happens on-device, in comparison Google Photos does that server side so they already have ML infra.
I think that's the thing: Apple is good at very little, but they seem like they're good at "everything else" because they don't do much else. Lots of companies spread themselves really thin trying to get into lots of unrelated competencies and tons of products. Apple doesn't.
Why does a MacBook seem better than PC laptops? Because Apple makes so few designs. When you make so few things, you can spend more time refining the design. When you're churning out a dozen designs a year, can you optimize the fan as well for each one? You hit a certain point where you say "eh, good enough." Apple's aluminum unibody MacBook Pro was largely the same design 2008-2021. They certainly iterated on it, but it wasn't "look at my flashy new case" every year. PC laptop makers come out with new designs with new materials so frequently.
With iPhones, Apple often keeps a design for 3 years. It looks like Samsung has churned out over 25 phone models over the past year while Apple has 5 (iPhone, iPhone Plus, iPhone Pro, iPhone Pro Max, iPhone 16e).
It's easy to look so good at things when you do fewer things. I think this is one of Apple's great strengths - knowing where to concentrate its effort.
This is some magical thinking. Even if Samsung took all their manpower, all their thought process and all their capital, they still couldn’t produce a laptop that competes with the MacBook (just to take one example), because they fundamentally don’t have any taste as a company.
Hell, they can’t even make a TV this year that’s less shit than last year’s version, and all that requires is doing literally nothing.
I feel like this ignores how big of a part the software is for those "consumer electronics" Apple is so good at making.
Apple definitely has software expertise, maybe it's not as specialized into AI as it is about optimizing video or music editors, but to suggest they'd be at the same starting point as an agriculture endeavor feels dishonest.
It's been a long-running claim that Apple can't do software as well as competitors, though in my experience they've beaten Google and a few others at devex and UX in their mobile frameworks over time, despite initial roughness. Slow and steady might win this race eventually, too.
Google's strategy is as unreadable as ever. It feels like two companies fighting each other in one.
On the one hand, they apparently want to be a service provider Microsoft-style. They are just signing a partnership with their biggest competitor and giving them access to their main competitive advantage, the most advanced AI available.
On the other hand, they want to be another Apple. They are locking down their phone. Are competing with the manufacturers of the best Android phones. Are limiting the possibility of distributing software on their system. Things that were their main differentiator.
It doesn't make sense. It's also a giant middle finger to the people who bought the Pixel for Gemini. Congrats, you were beta testers for iPhone users who won't have to share their data with Google for training Gemini. I have rarely seen a company as disrespectful to its customers.
I wonder if this will make my original HomePods interesting to talk to, or if they won't provide this on older devices.
given that gemini 3 pro is presumably a relatively small model it wouldn't be too surprising to see an even more optimized model fit into latest iphones. I wish we knew the data behind gemini 3 flash because if my estimation that it's <50b is true, holy shit.
It tells you how bad their product management and engineering teams are that they haven't just decided to kill Siri and start from scratch. Siri has been utterly awful, and that's an understatement, for at least half a decade.
This seems like a pretty significant anti-trust issue. One of the two mobile OS makers is using a product from the other for its AI assistance. And that means that basically all mobile devices will be using the same AI technology.
I don't expect the current US government to do anything about it though.
What antitrust rule do you think would be breached?
I admit I don't see the issue here. Companies are free to select their service providers, and free to dominate a market (as long as they don't abuse such dominant position).
Gatekeeping - nobody else can be the default voice assistant or power Siri, so where does this leave eg OpenAI? The reason this is important is their DOJ antitrust case, about to start trial, has made this kind of conduct a cornerstone of their allegations that Apple is a monopoly.
It also lends credence to the DOJ's allegation that Apple is insulated from competition - the result of failing to produce their own winning AI service is an exclusive deal to use Google while all competing services are disadvantaged, which is probably not the outcome a healthy and competitive playing field would produce.
So because Apple chose not to spend money to develop its own AI, it must be punished for then choosing to use another company's model? And the reason this is an issue is because both companies are large?
This feels a little squishy... At what size of each company does this stop being an antitrust issue? It always just feels like a vibe check, people cite market cap or marketshare numbers but there's no hard criteria (at least that I've seen) that actually defines it (legally, not just someones opinion).
The result of that is that it's sort of just up to whoever happens to be in charge of the governing body overseeing the case, and that's just a bad system for anyone (or any company) to be subjected to. It's bad when actual monopolistic abuse is happening and the governing body decides to let it slide, and it's bad when the governing body has a vendetta or directive to just hinder certain companies/industries regardless of actual monopolistic abuse.
> Gatekeeping - nobody else can be the default voice assistant or power Siri, so where does this leave eg OpenAI?
Sorry if I'm missing the point but if Apple had picked OpenAI, couldn't you have made the same comment? "nobody else can be the default voice assistant or power Siri, so where does this leave eg Gemini/Claude?".
They are in a duopoly on the Mobile OS market, with no other significant player available. Google would be the sole integrated mobile AI, though there are competitors available if customers wanted to switch (customers for such products being the OS companies buying the AI services, not the end-users).
However I don't see the link, how they are "using their duopoly", and why "they" would be using it but only one of them benefits from it. Being a duopoly, or even a monopoly, is not against anti-trust law by itself.
ChatGPT is currently integrated into Apple Intelligence. When I ask Siri something I can choose to use ChatGPT for my answer.
https://support.apple.com/guide/iphone/use-chatgpt-with-appl...
So I'm guessing in a future update it will be Gemini instead. I hope it's going to be more of an option to choose between the 2.
How has Apple made some of the greatest phones in history, with amazing engineering and a lot more, but just can't make a simple model to run locally on the phone when many others have?
I think it's good. Google has a record of being stable and working with large partners (government etc.) and it avoids the controversial cult of Altman.
If they wanted to, they could throw massive amounts of cash at it like Google and Facebook are, with the latter poaching Apple employees with $200 million pay packages: https://www.bloomberg.com/news/articles/2025-07-09/meta-poac...
But why on earth would they do that? It's both cheaper and safer to buy Google's model, with whom they already have a longstanding relationship. Examples include the search engine deal, and using Google Cloud infrastructure for iCloud and other services. Their new "private cloud compute" already runs on GCP too, perfect! Buying Gemini just makes sense, for now. Wait a few years until the technology becomes more mature/stable and then replace it with their own for a reasonable price.
No, they couldn't, because all current and future training hardware is already tied up by contracts from the frontier labs. Apple could not simply buy its way in given how constrained the supply is.
Can someone explain to me how this was allowed to happen? Wasn't Siri supposed to be the leading AI agent not ten years ago? How was there such a large disconnect at Apple between what Siri could do and what "real" AI was soon to be capable of?
Was this just a massive oversight at Apple? Were there not AI researchers at Apple sounding the alarm that they were way off with their technology and its capabilities? Wouldn't there be talk within the industry that this form of AI assistant would soon be looked at as useless?
Am I missing something?
Source: while I don’t have any experience with the inner workings of Siri, I have extensive experience with voice based automation with call centers (Amazon Connect) and Amazon Lex (the AWS version of Alexa).
Siri was never an “AI agent”, with intent based systems, you give the system phrases to match on (intents) and to fulfill an intent, all of the “slots” have to be fulfilled. For instance “I want to go from $source to $destination” and then the system calls an API.
There is no AI understanding - it’s a “1000 monkeys implementation”, you just start giving the system a bunch of variations and templates you want to match on in every single language you care about and match the intents to an API. That’s how Google and Alexa also worked pre LLM. They just had more monkeys dedicated to creating matching sentences.
Post LLM, you tell the LLM what the underlying system is capable of, the parameters the API requires to fulfill an action and the LLM can figure out the users intentions and ask follow up questions until it had enough info to call the API. You can specify the prompt in English and it works in all of the languages that the LLM has been trained on.
Yes I’ve done both approaches
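The intent/slot approach described above can be sketched in a few lines — a toy illustration of template matching, not any vendor's actual implementation:

```python
# Sketch of the pre-LLM intent/slot approach: hand-written template
# phrases are matched against the utterance, slots are extracted, and
# a matched intent maps to an API call. Illustrative only -- in
# production there would be thousands of templates per language
# (the "1000 monkeys" the parent describes).
import re

INTENTS = [
    (re.compile(r"i want to go from (?P<source>\w+) to (?P<destination>\w+)"),
     "book_trip"),
    (re.compile(r"set a timer for (?P<minutes>\d+) minutes?"),
     "set_timer"),
]

def match_intent(utterance):
    for pattern, name in INTENTS:
        m = pattern.match(utterance.lower())
        if m:
            return name, m.groupdict()  # all slots filled -> call the API
    return None, {}                     # no template matched -> "I can't do that"

print(match_intent("I want to go from Boston to Austin"))
# → ('book_trip', {'source': 'boston', 'destination': 'austin'})
```

Any phrasing that no template anticipates falls straight through to the "I can't do that" branch — which is exactly the brittleness the LLM approach removes.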
I appreciate the response, but that doesn't really answer my question.
I want to know why the executive leadership at Apple failed to see LLMs as the future of AI. ChatGPT and Gemini are what Siri should be at this point. Siri was one of the leading voice-automated assistants of the past decade, and now Apple's only options are to strap on an existing solution to the name of their product or let it go defunct. So now Siri is just an added layer to access Gemini? Perhaps with a few hard-coded solutions to automate specific tasks on the iPhone, and that's their killer app into the world of AI? That's pathetic.
Is Apple already such a bloated corporation that it can no longer innovate fast enough to keep up with modern trends? It seems like only a few years ago they were super lean and able to innovate better than any major tech company around. LLMs were being researched in 2017. I guess three years was too short of a window to change the direction of Siri. They should have seen the writing on the wall here.
According to everything that has been reported, both the Google Assistant and Alexa are less reliable now that they are LLM based.
I don’t know why; in my much smaller-scale experience, converting from the intent-based approach to an LLM “tools”-based approach is much more reliable.
Siri was behind pre LLM because Apple didn’t throw enough monkeys at the problem.
Everything that an assistant can do is “hardcoded” even when it is LLM based.
Old way: voice -> text -> pattern matching -> APIs to back end functionality.
New Way: voice -> text -> LLM -> APIs to back end functionality.
How often have you come across a case where Siri understood something and said “I can’t do that”? That’s not an AI problem. That’s Apple not putting people on the intent -> API mapping. An LLM won’t solve the issue of exposing the APIs to Siri.
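The "new way" described above amounts to publishing a machine-readable description of each backend API, so the model can fill in the parameters itself and ask follow-up questions for whatever is still missing. A sketch, with a schema shape loosely modelled on common LLM function-calling APIs — the names and structure are illustrative, not any specific vendor's format:

```python
# "New way" sketch: instead of hand-written matching templates, each
# backend API is described once in a tool spec, and the model decides
# when and how to call it. Schema shape is illustrative only.

TOOL_SPECS = [
    {
        "name": "book_trip",
        "description": "Book travel between two cities",
        "parameters": {
            "source": {"type": "string", "required": True},
            "destination": {"type": "string", "required": True},
        },
    },
]

def missing_params(spec, supplied):
    # The parameters the model would ask follow-up questions about
    # before it can call the API.
    return [p for p, meta in spec["parameters"].items()
            if meta["required"] and p not in supplied]

spec = TOOL_SPECS[0]
print(missing_params(spec, {"source": "Boston"}))  # → ['destination']
```

Note that this only helps with understanding the request; if an API isn't in the spec list at all, the assistant still "can't do that" — which is the parent's point about exposing APIs to Siri.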
Previously, back in June, the discussion barely mentioned Google Gemini and leaned more towards why they weren't doing it themselves:
Apple weighs using Anthropic or OpenAI to power Siri
Guess I am not using Siri anymore…
By the way, have any of you ever tried to delete and disable Siri's iCloud backup? You can't do it.
In the article they clearly mentioned that the Gemini model will be used for the Foundation Model running on-device or on their own servers. They are not sending Siri requests to Google's servers.
My exact reaction every time I hear people discuss Siri. I don't think I've used it once in my life, and it's one of the first things I turn off on every new device. So interesting to see how different people use the same devices in completely different ways.
I used it when it launched to figure out it was useless and haven't gone back.
For CarPlay, yes. I don't need a virtual assistant to do things I can do but worse; I need reliable voice controls to send messages, start phone calls, change the map destination and such with as little friction as possible.
Siri needs faster and more flexible handling of Spotify, Google Maps and third-party messaging apps, not a slop generator.
Only for opening/closing the garage door, setting timers, and sending texts. What else do people use the digital assistants for?
I have a case currently open with Apple about this issue. It does not work. And I don't believe you; I'm sorry, I just don't believe you, because Apple says there is a technical problem preventing this that does not just affect me. I also tried it on the phones of three other friends, and it does not work.
That's where people get confused - it's not a chatbot or an LLM - it's a voice command interface. Adding something to the shopping list, setting a timer, turning up the heating in the back room, playing some music, skipping a track, sending a message - it works perfectly well for all of that, and that's what I use it for virtually every day.
This work is to turn it into something else, more like a chatbot, presumably.
Siri is already transitioning from an intent-based NLU system to an LLM.
In iOS 18.1 (on iPhone 15+) Siri is part intent-based, part on-device "Apple Intelligence" small LLM, and in iOS 18.2 it also supports off-device ChatGPT.
This year Siri 2.0 is expected to ditch the legacy intent-based system and instead use just the small on-device Apple Intelligence LLM plus (opt-in) off-device Gemini (running in some private cloud).
Google's release hints at this being more than just Siri:
> Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.
... https://blog.google/company-news/inside-google/company-annou...
This is actually the most important part of this announcement, and excellent news. I was pretty disappointed that they were going with an existing player rather than building their own models. But this implies that they will continue to build their own base models, just using Gemini as a starting point, which is a pretty good solution.
But, but, didn’t they say two years ago that they picked OpenAI to power apple intelligence? Where did that go? Wasn’t Scam Altman gay enough for Tim Apple?
Why couldn't Apple pull their finger out of their asses and make their own AI nonsense better than Crap GPT?
Enemies? Google contributes about 20% of Apple's profits annually through their default search engine deal, that's more profitable than just about everything they do or make except selling iPhones.
> The U.S. government said Apple Chief Executive Officer Tim Cook and Google CEO Sundar Pichai met in 2018 to discuss the deal. After that, an unidentified senior Apple employee wrote to a Google counterpart that “our vision is that we work as if we are one company.”
https://www.bloomberg.com/news/articles/2020-10-20/apple-goo...
Somewhat surprising. AI is such a core part of the experience. It feels like a mistake to outsource it to arguably your biggest competitor.