Comment by CharlesW a day ago

44 replies

> I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

Setting aside the obligatory HN dig at the end, LLMs are now commodities and the least important component of the intelligence system Apple is building. The hidden-in-plain-sight thing Apple is doing is exposing all app data as context and all app capabilities as skills. (See App Intents, Core Spotlight, Siri Shortcuts, etc.)
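The "app data as context, app capabilities as skills" pattern can be sketched generically. A minimal Python toy, purely illustrative: the `Skill` registry and all names below are hypothetical, not Apple's actual App Intents API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: app capabilities registered as model-agnostic
# "skills", loosely analogous to how App Intents expose app actions
# to the system for an assistant to invoke.
@dataclass
class Skill:
    name: str
    description: str
    handler: Callable[..., str]

registry: dict[str, Skill] = {}

def register(skill: Skill) -> None:
    registry[skill.name] = skill

def invoke(name: str, **kwargs) -> str:
    # Any vendor's tool-calling output can be routed here, which is
    # what keeps the underlying LLM swappable.
    return registry[name].handler(**kwargs)

register(Skill("send_message", "Send a message via the Messages app",
               lambda to, body: f"sent to {to}: {body}"))

print(invoke("send_message", to="Ana", body="running late"))
```

The point of the indirection is that the registry, not the model, owns the capability surface.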

Anyone with an understanding of Apple's rabid aversion to being bound by a single supplier understands that they've tested this integration with all foundation models, that they can swap Google out for another vendor at any time, and that they have a long-term plan to eliminate this dependency as well.

> Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own.

I'd be interested in a citation for this (Apple introduced two multilingual, multimodal foundation language models in 2025), but in any case anything you hear from Apple publicly is what they want you to think for the next few quarters, vs. an indicator of what their actual 5-, 10-, and 20-year plans are.

dktp 21 hours ago

My guess is that this is bigger lock-in than it might seem on paper.

Google and Apple together will post-train Gemini to Apple's specification. Google has the know-how as well as the infra, and will happily do this (essentially for free) to continue the mutually beneficial relationship, as well as lock out competitors that asked for more money (Anthropic).

Once this goes live, provided Siri improves meaningfully, it is quite an expensive experiment to then switch to a different provider.

For any single user, the switching costs to a different LLM are next to nothing. But at Apple's scale they need to be extremely careful and confident that the switch is an actual improvement.

  • TheOtherHobbes 20 hours ago

    It's a very low baseline with Siri, so almost anything would be an improvement.

    • anamexis 18 hours ago

      The point is that once Siri is switched to a Gemini-based model, the baseline presumably won't be low anymore.

      • brokencode 15 hours ago

        I’m not so sure. Just think about coding assistants with MCP based tools. I can use multiple different models in GitHub Copilot and get good results with similarly capable models.

        Siri’s functionality and OS integration could be exposed in a similar, industry-standard way via tools provided to the model.

        Then any other model can be swapped in quite easily. Of course, they may still want to do fine tuning, quantization, performance optimization for Apple’s hardware, etc.

        But I don’t see why the actual software integration part needs to be difficult.
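        The swap-friendly design described above can be sketched with a provider-agnostic interface. This is a toy illustration, not any real vendor SDK; the class and method names are invented.

```python
from typing import Protocol

class ChatModel(Protocol):
    # Minimal contract: any backend that can complete a prompt
    # given a fixed tool schema is interchangeable.
    def complete(self, prompt: str, tools: list[dict]) -> str: ...

# The OS-side tool schema stays fixed; only the model behind it changes.
SIRI_TOOLS = [
    {"name": "set_alarm", "params": {"time": "str"}},
    {"name": "play_album", "params": {"title": "str"}},
]

class GeminiBackend:
    def complete(self, prompt: str, tools: list[dict]) -> str:
        return f"[gemini] {prompt} ({len(tools)} tools available)"

class OtherVendorBackend:
    def complete(self, prompt: str, tools: list[dict]) -> str:
        return f"[other] {prompt} ({len(tools)} tools available)"

def assistant_reply(model: ChatModel, prompt: str) -> str:
    return model.complete(prompt, SIRI_TOOLS)

# Swapping vendors is one line, because the tool contract is shared.
print(assistant_reply(GeminiBackend(), "wake me at 7"))
print(assistant_reply(OtherVendorBackend(), "wake me at 7"))
```

        Fine-tuning and quantization still differ per vendor, but the integration seam itself stays stable.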

      • inferiorhuman 13 hours ago

        Doubt it. Of all the issues I run into with Siri none could be solved by throwing AI slop at it. Case in point: if I ask Siri to play an album and it can't match the album name it just plays some random shit instead of erroring out.
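        The match-or-error behavior being asked for here is straightforward to sketch. A toy example (the album names and `resolve_album` helper are hypothetical, not Siri internals):

```python
import difflib

LIBRARY = ["In Rainbows", "OK Computer", "Kid A"]

def resolve_album(query: str) -> str:
    # Match-or-error: fail loudly on a bad match instead of
    # silently playing something random.
    lowered = {a.lower(): a for a in LIBRARY}
    matches = difflib.get_close_matches(query.lower(), list(lowered),
                                        n=1, cutoff=0.6)
    if not matches:
        raise LookupError(f"no album matching {query!r}")
    return lowered[matches[0]]

print(resolve_album("ok computr"))  # typo still resolves
```

        The design choice is the `cutoff`: below it, raising an error beats guessing.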

    • eastbound 19 hours ago

      Ollama! Why didn’t they just run Ollama and a public model? They’ve spent the last 10 years with a Siri that doesn’t know any contact named Chronometer, only to now require the best-in-class LLM?

      • chankstein38 19 hours ago

        The other day I was trying to navigate to a Costco in my car. So I opened Google Maps on Android Auto on the screen in my car and pressed the search box. My car won't allow me to type even while parked... so I have to speak to the Google Assistant.

        I was in the map search, so I just said "Costco" and it said "I can't help with that right now, please try again later" or something of the sort. I tried a couple more times until I changed up to saying "Navigate me to Costco" where it finally did the search in the textbox and found it for me.

        Obviously this isn't the same thing as Gemini, but the experience with Android Auto becomes more and more garbage as time passes, and I'm concerned that now we're going to have two Google voice assistants.

        Also, tbh, Gemini was great a month ago but since then it's become total garbage. Maybe it passes benchmarks or whatever but interacting with it is awful. It takes more time to interact with than to just do stuff yourself at this point.

        I tried Google Maps AI last night and, wow. The experience was about as garbage as you can imagine.

      • crazygringo 15 hours ago

        I'm genuinely curious about this too. If you really only need the language and common sense parts of an LLM -- not deep factual knowledge of every technical and cultural domain -- then aren't the public models great? Just exactly what you need? Nobody's using Siri for coding.

        Are there licensing issues regarding commercial use at scale or something?

      • JumpCrisscross 12 hours ago

        > Why didn’t they just run Ollama and a public model

        Same reason they switched to Intel chips in the 2000s. They were better. Then Cupertino watched. And it learned. And it leapfrogged.

        If I were Google, my fear would be Apple launching and then cutting the line at TSMC to mass produce custom silicon in the 2030s.

  • ChrisMarshallNY 13 hours ago

    > provided Siri improves meaningfully

    Not a high bar…

    That said, Apple is likely to end up training their own model, sooner or later. They are already in the process of building out a bunch of data centers, and I think they have even designed in-house servers.

    Remember when iPhone maps were Google Maps? Apple Maps has been steadily improving, to the point that it is as good as, if not better than, Google Maps in many areas, at least around here. I recently had a friend send me a GM link to a destination, and the phone used GM for directions. It was much worse than Apple Maps; after a few wrong turns, I pulled over, fed the destination into Apple Maps, and completed the journey.

hadlock a day ago

> what their actual 5-, 10-, and 20-year plans are

Seems like they are waiting for the "slope of enlightenment" on the Gartner hype cycle to flatten out. Given you can just lease or buy a SOTA model from leading vendors, there's no advantage to training your own right now. My guess is that the LLM/AI landscape will look entirely different by 2030, and any 5-year plan won't be in the same zip code, let alone playing field. Leasing an LLM from Google with a support contract seems like a pretty smart short-term play as things continue to evolve over the next 2-3 years.

  • IgorPartola 19 hours ago

    This is the key. You don't need superhuman intelligence in a phone AI assistant; most of the time you don't need it at all. Current SOTA models do a decent job of approximating college-grad-level intelligence, let's say, 85% of the time, which is helpful and cool but could clearly be better.

    But the pace at which models are getting smarter is accelerating, AND they are getting more energy- and memory-efficient. So if something like DeepSeek is roughly 2 years behind SOTA models from Google and others, then in 2030 you can expect 2028-level performance out of open models. There will come a time when a model capable of college-grad-level intelligence 99.999% of the time will be able to run on a $300 device.

    If you are Apple, you do not need to lead the charge on a SOTA model; you can just wait until one is available for much cheaper. Your product is the devices and services consumers buy. If you are OpenAI, you have no other products. In an industry that will, in the next few years, become dominated by open models that are good enough, you must become THE AI to have, or close up shop, or come up with another product that has more of a moat.

    • ipaddr 19 hours ago

      "pace at which the models are getting smart is accelerating". The pace is decelerating.

      • slwvx 17 hours ago

        My impression is that solar (and maybe wind?) energy have benefited from learning-by-doing [1][2] that has resulted in lower costs and/or improved performance each year. It seems reasonable to me that a similar process will apply to AI (at least in the long run). The rate of learning could be seen as a "pace" of improvement. I'm curious, do you have a reference for the deceleration of pace that you refer to?

        [1] https://emp.lbl.gov/news/new-study-refocuses-learning-curve

        [2] https://ourworldindata.org/grapher/solar-pv-prices-vs-cumula...

      • crazygringo 15 hours ago

        I don't think anyone really knows, because there's no objective standard for determining progress.

        Lots of benchmarks exist where everyone agrees that higher scores are better, but there's no sense in which going from a score of 400 to 500 is the same progress as going from 600 to 700, or less, or more. They only really have directional validity.

        I mean, the scores might correspond to real-world productivity rates in some specific domain, but that just raises the question again -- productivity rates on a specific task are not intelligence.

    • jimbokun 11 hours ago

      $300 college student in your pocket sure sounds like the Singularity to me.

VirusNewbie 17 hours ago

> LLMs are now commodities and the least important component of the intelligence system Apple is building

If that were even remotely true, Apple, Meta, and Amazon would have SoTA foundational models.

  • Majromax 16 hours ago

    Why? Grain is a commodity, but I buy flour at the store rather than grow my own. The “commodity” argument suggests that new companies should stay away from model training unless they have a cost edge.

    • VirusNewbie 15 hours ago

      Are you not aware that all of the above have all invested billions trying to train a SoTA Foundational model?

bigyabai 21 hours ago

That's not an "obligatory HN dig" though; you're watching, in medias res, X escape removal from the App Store and Play Store. Concepts like privacy, legality and high-quality software are all theater. We have no altruists defending these principles for us at Apple or Google.

Apple won't switch Google out as a provider for the same reason Google is your default search provider. They don't give a shit about how many advertisements you're shown. You are actually detached from 2026 software trends if you think Apple is going to give users significant backend choices. They're perfectly fine selling your attention to the highest bidder.

  • theshrike79 17 hours ago

    There are second-order effects of Google or Apple removing Twitter from their stores.

    Guess who's the bestie of Twitter's owner? Any clues? Could that be a vindictive old man with unlimited power and no checks and balances to temper his tantrums?

    Of course they both WANT Twitter the fuck out of the store, but there are very very powerful people addicted to the app and what they can do with it.

    • bigyabai 15 hours ago

      That further proves my point that they are monopolies that cannot survive without protectionist intervention.

      • mschuster91 14 hours ago

        In the current US environment, no one can survive going against Trump, and as recently evidenced, this is meant literally.

        The US, for all intents and purposes, is now a kleptocracy. Rule of law, freedom of speech, even court orders, all of that doesn't matter any more in practice. There will always be some way for the federal government to strong-arm anyone into submission.

  • kennywinker 19 hours ago

    Caveat: as long as it doesn’t feel like you’re being sold out.

    Which is why "privacy theatre" was an excellent way to put it.

  • yunohn 19 hours ago

    Apple’s privileged device-level ads, instant-stop-on-cancel trials, and special notification rules for their paid add-on services (Fitness+, Music, Arcade, iCloud+, etc.) are all proof that they no longer care about the user.