Fiveplus a day ago

The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game, what, three years ago?

Apple has the best edge inference silicon in the world (neural engine), but they have effectively zero presence in a training datacenter. They simply do not have the TPU pods or the H100 clusters to train a frontier model like Gemini 2.5 or 3.0 from scratch without burning 10 years of cash flow.

To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

It's a smart move. Let Google burn the gigawatts training the trillion parameter model. Apple will just optimize the quantization and run the distilled version on the private cloud compute nodes. I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.
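
To be concrete about the "quantization" part: it just means storing the (distilled) weights at lower precision so they fit on cheaper hardware. A toy sketch of the idea, not any real Apple or Google pipeline:

    import Foundation

    // Toy symmetric int8 quantization: map each float weight onto -127...127
    // using a single per-tensor scale, trading a little accuracy for ~4x less memory.
    func quantize(_ weights: [Float]) -> (values: [Int8], scale: Float) {
        let maxAbs = weights.map { abs($0) }.max() ?? 0
        let scale = maxAbs > 0 ? maxAbs / 127 : 1
        let values = weights.map { Int8(clamping: Int(($0 / scale).rounded())) }
        return (values, scale)
    }

    // Recover approximate floats at inference time.
    func dequantize(_ values: [Int8], scale: Float) -> [Float] {
        values.map { Float($0) * scale }
    }

    let w: [Float] = [0.12, -0.98, 0.45, 0.003]
    let (q, s) = quantize(w)
    print(q, dequantize(q, scale: s))  // roughly recovers w from 1/4 the storage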

  • CharlesW a day ago

    > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

    Setting aside the obligatory HN dig at the end, LLMs are now commodities and the least important component of the intelligence system Apple is building. The hidden-in-plain-sight thing Apple is doing is exposing all app data as context and all app capabilities as skills. (See App Intents, Core Spotlight, Siri Shortcuts, etc.)
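
    To make "capabilities as skills" concrete: a skill is essentially an App Intent that the system can discover and invoke on the user's behalf. A minimal sketch (the note store and intent here are made up for illustration; only the App Intents framework itself is real):

        import AppIntents

        // Hypothetical in-app storage, standing in for a real app's model layer.
        final class NoteStore {
            static let shared = NoteStore()
            private(set) var notes: [String] = []
            func add(_ text: String) { notes.append(text) }
        }

        // Declaring this type is enough for Siri/Shortcuts (and, per Apple's pitch,
        // Apple Intelligence) to surface "Create Note" as an invocable capability.
        struct CreateNoteIntent: AppIntent {
            static var title: LocalizedStringResource = "Create Note"

            @Parameter(title: "Text")
            var text: String

            @MainActor
            func perform() async throws -> some IntentResult {
                NoteStore.shared.add(text)
                return .result()
            }
        }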

    Anyone with an understanding of Apple's rabid aversion to being bound by a single supplier understands that they've tested this integration with all foundation models, that they can swap Google out for another vendor at any time, and that they have a long-term plan to eliminate this dependency as well.

    > Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own.

    I'd be interested in a citation for this (Apple introduced two multilingual, multimodal foundation language models in 2025), but in any case anything you hear from Apple publicly is what they want you to think for the next few quarters, vs. an indicator of what their actual 5-, 10-, and 20-year plans are.

    • dktp 21 hours ago

      My guess is that this is bigger lock-in than it might seem on paper.

      Google and Apple together will posttrain Gemini to Apple's specification. Google has the know-how as well as infra and will happily do this (for free ish) to continue the mutually beneficial relationship - as well as lock out competitors that asked for more money (Anthropic)

      Once this goes live, provided Siri improves meaningfully, it is quite an expensive experiment to then switch to a different provider.

      For any single user, the switching costs to a different LLM are next to nothing. But at Apple's scale they need to be extremely careful and confident that the switch is an actual improvement

      • TheOtherHobbes 20 hours ago

        It's a very low baseline with Siri, so almost anything would be an improvement.

      • ChrisMarshallNY 13 hours ago

        > provided Siri improves meaningfully

        Not a high bar…

        That said, Apple is likely to end up training their own model, sooner or later. They are already in the process of building out a bunch of data centers, and I think they have even designed in-house servers.

        Remember when iPhone maps were Google Maps? Apple Maps have been steadily improving, to the point they are as good as, if not better than, Google Maps, in many areas (like around here. I recently had a friend send me a GM link to a destination, and the phone used GM for directions. It was much worse than Apple Maps. After a few wrong turns, I pulled over, fed the destination into Apple Maps, and completed the journey).

    • hadlock a day ago

      > what their actual 5-, 10-, and 20-year plans are

      Seems like they are waiting for the "slope of enlightenment" on the Gartner hype curve to flatten out. Given that you can just lease or buy a SOTA model from leading vendors, there's no advantage to training your own right now. My guess is that the LLM/AI landscape will look entirely different by 2030 and any 5-year plan won't be in the same zip code, let alone playing field. Leasing an LLM from Google with a support contract seems like a pretty smart short-term play as things continue to evolve over the next 2-3 years.

      • IgorPartola 19 hours ago

        This is the key. The real issue is that you don't need superhuman intelligence in a phone AI assistant. You don't need it most of the time, in fact. Current SOTA models do a decent job of approximating college-grad-level human intelligence, let's say 85% of the time, which is helpful and cool but clearly could be better. But the pace at which the models are getting smarter is accelerating AND they are getting more energy efficient and memory efficient. So if something like DeepSeek is roughly 2 years behind the SOTA models from Google and others, then in 2030 you can expect 2028-level performance out of open models. There will come a time when a model capable of college-grad-level intelligence 99.999% of the time will be able to run on a $300 device. If you are Apple you do not need to lead the charge on a SOTA model; you can just wait until one is available for much cheaper. Your product is the devices and services consumers buy. If you are OpenAI you have no other products. You must become THE AI to have in an industry that will, in the next few years, become dominated by open models that are good enough, or close up shop, or come up with another product that has more of a moat.

    • VirusNewbie 17 hours ago

      > LLMs are now commodities and the least important component of the intelligence system Apple is building

      If that were even remotely true, Apple, Meta, and Amazon would have SoTA foundational models.

      • Majromax 16 hours ago

        Why? Grain is a commodity, but I buy flour at the store rather than grow my own. The "commodity" argument suggests that new companies should stay away from model training unless they have a cost edge.

        • VirusNewbie 15 hours ago

          Are you not aware that all of the above have all invested billions trying to train a SoTA Foundational model?

    • bigyabai 21 hours ago

      That's not an "obligatory HN dig" though; you're watching, in medias res, X escape removal from the App Store and Play Store. Concepts like privacy, legality and high-quality software are all theater. We have no altruists defending these principles for us at Apple or Google.

      Apple won't switch Google out as a provider for the same reason Google is your default search provider. They don't give a shit about how many advertisements you're shown. You are actually detached from 2026 software trends if you think Apple is going to give users significant backend choices. They're perfectly fine selling your attention to the highest bidder.

      • theshrike79 17 hours ago

        There are second-order effects of Google or Apple removing Twitter from their stores.

        Guess who's the bestie of Twitter's owner? Any clues? Could that be a vindictive old man with unlimited power and no checks and balances to temper his tantrums?

        Of course they both WANT Twitter the fuck out of the store, but there are very very powerful people addicted to the app and what they can do with it.

      • kennywinker 19 hours ago

        Caveat: as long as it doesn’t feel like you’re being sold out.

        Which is why privacy theatre was an excellent way to put it

      • yunohn 19 hours ago

        Apple's various privileged device-level ads, instant-stop-on-cancel trials, and special notification rules for their paid additional services like Fitness+, Music, Arcade, iCloud+, etc. are all proof that they do not care about the user anymore.

  • concinds a day ago

    An Apple-developed LLM would likely be worse than SOTA, even if they dumped billions on compute. They'll never attract as much talent as the others, especially given how poorly their AI org was run (reportedly). The weird secrecy will be a turnoff. The culture is worse and more bureaucratic. The past decade has shown that Apple is unwilling to fix these things. So I'm glad Apple was forced to overcome their Not-Invented-Here syndrome/handicap in this case.

    • blitzar a day ago

      Apple might have gotten very lucky here ... the money might be in finding uses, and selling physical products rather than burning piles of cash training models that are SOTA for 5 minutes before being yet another model in a crowded field.

      My money is still on Apple and Google to be the winners from LLMs.

      • illiac786 6 hours ago

        I agree. That’s why I think EU‘s DMA is visionary, even if not perfect. LLM wars will prove EU regulators right I anticipate.

      • lamontcg a day ago

        And when the cost of training LLMs starts to come down to under $1B/yr, Apple can jump on board, having saved >$100B in not trying to chase after everyone else to try to get there first.

      • Melatonic 21 hours ago

        Apple has also never been big on the server-side equation of both software and hardware - don't they already outsource most of their cloud stack to Google via GCP?

        I can see them eventually training their own models (especially smaller and more targeted / niche ones) but at their scale they can probably negotiate a pretty damn good deal renting Google TPUs and expertise.

      • DrewADesign 15 hours ago

        Yeah... there's this "bro, do you even business?" vibe in the tech world right now pointed at any tech firm not burning oil tankers full of cash (and oil, for that matter) training a giant model. That money isn't free - the economic consequences of burning billions to make a product that will be several steps behind, at best, are giant. There's a very real chance these companies won't recoup that money if their product isn't attractive to hordes of users willing to pay more money for AI than anyone currently is. It doesn't even make them look cool to regular people - their customers hate hearing about AI. Since there are viable third-party options available, I think Apple would have to be out of their goddamned minds to try and jump in that race right now. They're a product company. Nobody is going to not buy an iPhone because they're using a third-party model.

    • microtherion 19 hours ago

      Reportedly, Meta is paying top AI talent up to $300M for a 4-year contract. As much as I'm in favor of paying engineers well, I don't think salaries like this (unless they are across the board for the company, which they are of course not) are healthy for the company long term (cf. Anthony Levandowski, who had money thrown at him by Google, only to rip them off).

      So I'm glad Apple is not trying to get too deep into a bidding war. As for how well orgs are run, Meta has its issues as well (cf. the fiasco with its eponymous product), while Google seems to be steadily eroding its core products.

      • EgregiousCube 15 hours ago

        Why would paying everyone $300M across the board be healthier than using it as a tool to (attempt to) attract the best of the best?

  • maxloh a day ago

    Is the training cost really that high, though?

    The Allen Institute (a non-profit) just released the Molmo 2 and Olmo 3 models. They trained these from scratch using public datasets, and they are performance-competitive with Gemini in several benchmarks [0] [1].

    AMD was also able to successfully train an older version of OLMo on their hardware using the published code, data, and recipe [2].

    If a non-profit and a chip vendor (training for marketing purposes) can do this, it clearly doesn't require "burning 10 years of cash flow" or a Google-scale TPU farm.

    [0]: https://allenai.org/blog/molmo2

    [1]: https://allenai.org/blog/olmo3

    [2]: https://huggingface.co/amd/AMD-OLMo

    • lostmsu 20 hours ago

      No, it doesn't beat Gemini in any benchmarks. It beats Gemma, which isn't SoTA even among open models of that size. That would be Nemotron 3 or GPT-OSS 20B.

    • turtlesdown11 a day ago

      No, of course the training costs aren't that high. Apple's ten years of future free cash flow is greater than a trillion dollars (they are above $100b per year). Obviously, the training costs are a trivial amount compared to that figure.

      • ufmace 17 hours ago

        What I'm wondering - their future cash flow may be massive compared to any conceivable rational task, but the market for servers and datacenters seems to be pretty saturated right now. Maybe, for all their available capital, they just can't get sufficient compute and storage on a reasonable schedule.

      • bombcar 21 hours ago

        I have no idea what AI involves, but "training" sounds like a one-and-done - but how is the result "stored"? If you have trained up a Gemini, can you "clone" it and if so, what is needed?

        I was under the impression that all these GPUs and such were needed to run the AI, not only ingest the data.

      • amelius 20 hours ago

        Hiring the right people should also be trivial with that amount of cash.

    • PunchyHamster 17 hours ago

      My prediction is that they might switch once the AI craze simmers down to some more reasonable level.

  • drob518 a day ago

    Yea, I think it's smart, too. There are multiple companies who have spent a fortune on training and are going to be increasingly interested in (desperate to?) see a return from it. Apple can choose the best of the bunch, pay less than it would cost to build it themselves, and swap to a new one if someone produces another breakthrough.

    • Fiveplus a day ago

      100%. It feels like Apple is perfectly happy letting the AI labs fight a race to the bottom on pricing while they keep the high-margin user relationship.

      I'm curious if this officially turns the foundation model providers into the new "dumb pipes" of the tech stack?

      • drob518 a day ago

        It’ll be interesting to see how it plays out. The question is, what’s the moat? If all they have is scaling to drive better model performance, then the winner is just whoever has the lowest cost of capital.

      • whywhywhywhy 21 hours ago

        As if they really have a choice though. Competing would be a billion dollar Apple Maps scenario.

  • overfeed 21 hours ago

    > The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game like what three years ago?

    It goes back much further than that - up until 2016, Apple wouldn't let its ML researchers add author names to published research papers. You can't attract world-class talent in research with a culture built around paranoid secrecy.

    • sumedh an hour ago

      > You can't attract world-class talent in research with a culture built around paranoid secrecy.

      Would giving more money/shares help?

  • ceejayoz a day ago

    > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

    This sort of thing didn't work out great for Mozilla. Apple, thankfully, has other business bringing in the revenue, but it's still a bit wild to put a core bit of the product in the hands of the only other major competitor in the smartphone OS space!

    • apercu a day ago

      I dunno, my take is that Apple isn't outsourcing intelligence; rather, it's outsourcing the most expensive, least defensible layer.

      Down the road Apple has an advantage here: a super large training data set that includes messages, mail, photos, calendar, health, app usage, location, purchases, voice, biometrics, and your behaviour over YEARS.

      Let's check back in 5 years and see if Apple is still using Gemini or if Apple distills, trains and specializes until they have completed building a model-agnostic intelligence substrate.

  • aurareturn a day ago

    Seems like there is a moat after all.

    The moat is talent, culture, and compute. Apple doesn't have any of these 3 for SOTA AI.

    • elzbardico a day ago

      It is more like Apple has no need to spend billions on training with questionable ROI when it can just rent from one of the commodity foundation model labs.

      • nosman 20 hours ago

        I don't know why people automatically jump to Apple's defense on this.... They absolutely did spend a lot of money and hired people to try this. They 100% do NOT have the open and bottom-up culture needed to pull off large scale AI and software projects like this.

        Source: I worked there

      • aurareturn 12 hours ago

        It’s such a commodity that there are only 3 SOTA labs left and no one can catch them. I’m sure it’ll be consolidated further in the future and you’re going to be left with a natural monopoly or duopoly.

        Apple has no control over the most important change to tech. They've ceded control to Google.

    • jpfromlondon 18 hours ago

      is it that surprising? they're a hardware company after all.

  • hmokiguess a day ago

    I always think about this, can someone with more knowledge than me help me understand the fragility of these operations?

    It sounds like the value of these very time-consuming, resource-intensive, and large scale operations is entirely self-contained in the weights produced at the end, right?

    Given that we have a lot of other players enabling this in other ways, like Open Sourcing weights (West vs East AI race), and even leaks, this play by Apple sounds really smart and the only opportunity window they are giving away here is "first to market" right?

    Is it safe to assume that eventually the weights will be out in the open for everyone?

    • bayarearefugee 19 hours ago

      > and the only opportunity window they are giving away here is "first to market" right?

      A lot of the hype in LLM economics is driven by speculation that eventually training these LLMs is going to lead to AGI and the first to get there will reap huge benefits.

      So if you believe that, being "first to market" is a pretty big deal.

      But in the real world there's no reason to believe LLMs lead to AGI, and given the fairly lock-step nature of the competition, there's also not really a reason to believe that even if LLMs did somehow lead to AGI that the same result wouldn't be achieved by everyone currently building "State of the Art" models at roughly the same time (like within days/months of each other).

      So... yeah, what Apple is doing is actually pretty smart, and I'm not particularly an Apple fan.

    • pests 19 hours ago

      > is entirely self-contained in the weights produced at the end, right?

      Yes, and the knowledge gained along the way. For example, the new TPUv4 that Google uses requires rack and DC aware technologies (like optical switching fabric) for them to even work at all. The weights are important, and there is open weights, but only Google and the like are getting the experience and SOTA tech needed to operate cheaply at scale.

  • LeoPanthera 18 hours ago

    Google says: "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."

    So what does it take? How many actual commitments to privacy does Apple have to make before the HN crowd stops crowing about "theater"?

  • Sevii 19 hours ago

    Apple's goal is likely to run all inference locally. But models aren't good enough yet and there isn't enough RAM in an iPhone. They just need Gemini to buy time until those problems are resolved.

    • kennywinker 19 hours ago

      That was their goal, but in the past couple years they seem to have given up on client-side-only ai. Once they let that go, it became next to impossible to claw back to client only… because as client side ai gets better so does server side, and people’s expectations scale up with server side. And everybody who this was a dealbreaker for left the room already.

      • WorldMaker 15 hours ago

        Apple thinks they can get a best-of-both-worlds approach with Private Cloud Compute. They believe they can secure private servers specialized to specific client devices in a way that the cloud compute effort is still "client-side" from a trust standpoint, but still able to use extra server-side resources (under lock and key).

        I don't know how close to that ideal they've achieved, but especially given that this announcement is apparently based in part on an arrangement with Google that lets them run Gemini on-device and in Private Cloud Compute, without using Google's more direct Gemini services/cloud, I'm excited that they are trying and I'm interested in how this plays out.

    • O5vYtytb 17 hours ago

      Well DRAM prices aren't going down soon so I see this as quite the push away from local inference.

  • robotresearcher 18 hours ago

    For some context with numbers, in mid-2024 Apple publicly described 3B parameter foundation models. Gemini 3 Pro is about 1T today.

    https://machinelearning.apple.com/research/apple-intelligenc...

    • gilgoomesh 16 hours ago

      That 3B model is a local model that eventually got built into macOS 26. Gemini 3 Pro is a frontier model (cloud). They're very different things.

  • jedimastert 3 hours ago

    > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain

    I feel like people probably said this when Google became the default search engine for everyone...

  • segmondy a day ago

    10 years worth of cash? So all these Chinese labs that came out and did it for less than $1 billion must have 3 heads per developer, right?

    • andreyf 18 hours ago

      Rumor has it that they weren't trained "from scratch" the way US labs would train, i.e. Chinese labs benefitted from government-"procured" IP (the US $B models) in order to train their $M models. I also understand there to be real innovation in the many-MoE architecture on top of that. Would love to hear a more technical understanding from someone who does more than repeat rumors, though.

    • usef- 13 hours ago

      We don't really know how much it cost them. Plenty of reasons to doubt the numbers passed around and what it wasn't counting.

      (And even if you do believe it, they also aren't licensing the IP they're training on, unlike American firms who are now paying quite a lot for it.)

    • 4fterd4rk 17 hours ago

      A lot of HN commentators are high on their own supply with regard to the AI bubble... when you realize that this stuff isn't actually that expensive the whole thing begins to quickly unravel.

  • dabockster a day ago

    It also lets them keep a lot of the legal issues regarding LLM development at arm's length while still benefiting from them.

  • stronglikedan 21 hours ago

    > Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence.

    They have always been a premium "last mile" delivery network for someone else's intelligence, except that "intelligence" was always IP until now. They have always polished existing (i.e., not theirs) ideas and made them bulletproof and accessible to the masses. Seems like they intend to just do more of the same for AI "intelligence". And good for them, as it is their specialty and it works.

  • chatmasta 19 hours ago

    It’s also a bet that the capex cost for training future models will be much lower than it is today. Why invest in it today if they already have the moat and dominant edge platform (with a loyal customer base upgrading hardware on 2-3 year cycles) for deploying whatever future commoditized training or inference workloads emerge by the time this Google deal expires?

  • ysnp a day ago

    Could you elaborate a bit on why you've judged it as privacy theatre? I'm skeptical but uninformed, and I believe Mullvad are taking a similar approach.

    • greentea23 a day ago

      Mullvad is nothing like Apple. For Apple devices:

      - need a real email and real phone number to even boot the device
      - cannot disable telemetry
      - App Store apps only, even though many key privacy-preserving apps are not available
      - /etc/hosts is not your own; DNS control in general is extremely weak
      - VPN apps on iDevices have artificial holes
      - can't change the push notification provider
      - can only use WebKit for browsers, which lacks many important privacy-preserving capabilities
      - need to use an app you don't trust but want to sandbox it from your real information? Too bad, no way to do so
      - the source code is closed, so Apple can claim X but do Y; you have no proof that you are secure or private
      - without control of your OS you are subject to Apple complying with the government and pushing updates to serve them, not you, which they are happy to do to make a buck

      Mullvad requires nothing but an envelope with cash in it and a hash code and stores nothing. Apple owns you.

      • Melatonic 21 hours ago

        Agreed on most points, but you can set up a pretty solid device-wide DNS provider using configuration profiles. Similar to how iOS can be enrolled in corporate MDM - but under your control.

        Works great for me with NextDNS.

        Orion browser - while also based on WebKit - is also awesome and has great built-in ad blocking and supposedly privacy-respecting ideals.

        • greentea23 19 hours ago

          Apple has records that you are installing that, probably putting you on a list.

          And it works until it's made illegal in your country and removed from the app store. You have no guarantees that anything that works today will work tomorrow with Apple.

          Apple is setting us up to be under a dictator's thumb one conversion at a time.

      • MrDarcy a day ago

        This comment confuses privacy with anonymity.

      • apparent 19 hours ago

        You do not need an email address to set up an iPhone, and you do not need an email address or phone number to set up an iPad/Mac.

        If you want to use the App Store on these devices, you do need to have an email address.

    • natch a day ago

      They transitioned from “nobody can read your data, not even Apple” to “Apple cannot read your data.” Think about what that change means. And even that is not always true.

      They also were deceptive about iCloud encryption where they claimed that nobody but you can read your iCloud data. But then it came out after all their fanfare that if you do iCloud backups Apple CAN read your data. But they aren’t in a hurry to retract the lie they promoted.

      Also if someone in another country messages you, if that country’s laws require that Apple provide the name, email, phone number, and content of the local users, guess what. Since they messaged you, now not only their name and information, but also your name and private information and message content is shared with that country’s government as well. By Apple. Do they tell you? No. Even if your own country respects privacy. Does Apple have a help article explaining this? No.

      • threatofrain 21 hours ago

        If you want to turn on full end-to-end encryption you can, if you want to share your pubkey so that people can't fake your identity on iMessage you can, and there's still a higher tier of security than that presumably for journalists and important people.

        It's something a smart niece or nephew could handle in terms of managing risk, but the implications could mean getting locked out of your device which you might've been using as the doorway to everything, and Apple cannot help you.

      • dpoloncsak 21 hours ago

        >Also if someone in another country messages you, if that country’s laws require that Apple provide the name

        I don't mean to sound like an Apple fanboy, but is this true just for SMS or iMessage as well? It's my understanding that for SMS, Apple is at the mercy of governments and service providers, while iMessage gives them some wiggle room.

        Anecdotal, but when my messages were subpoenaed, it was only the SMS messages. US citizen, fwiw.

      • richwater 19 hours ago

        You people will never be happy until the only messaging that exists is in a dusty basement and Richard Stallman is sleeping on a dirty futon.

    • drnick1 a day ago

      Because Apple makes privacy claims all the time, but all their software is closed source and it is very hard or impossible to verify any of their claims. Even if messages sent between iPhones are end-to-end encrypted, for example, the client apps and the operating system may be backdoored (and likely are).

      https://en.wikipedia.org/wiki/PRISM

    • tempodox a day ago

      The gov’t can force them to reveal any user’s data and slap them with a gag order so no one will ever know this happened.

  • derefr 14 hours ago

    > They simply do not have the TPU pods or the H100 clusters to train a frontier model like Gemini 2.5 or 3.0 from scratch without burning 10 years of cash flow.

    Why does Apple need to build its own training cluster to train a frontier model, anyway?

    Why couldn't the deal we're reading about have been "Apple pays Google $200bn to lease exclusive-use timeslots on Google's AI training cluster"?

    • m3kw9 14 hours ago

      That would be more expensive in the long run, and Apple is all about the long game.

  • Melatonic 21 hours ago

    Personally I also think it's a very smart move - Google has TPUs and will do it more efficiently than anyone else.

    It also lets Apple stand by while the dust settles on who will out-innovate in the AI war - they could easily enter the game in a big way much later on.

    • fuzzy_lumpkins 9 hours ago

      Absolutely - right now they can avoid any risk but still get the benefits as they regroup.

  • hadlock a day ago

    Seems like the LLM landscape is still evolving, and training your own model provides no technical benefit as you can simply buy/lease one, without the overhead of additional eng staffing/datacenter build-out.

    I can see a future where LLM research stalls and stagnates, at which point the ROI on building/maintaining their own commodity LLM might become tolerable. Apple has had Siri as a product/feature, and they've proven for the better part of a decade that voice assistants are not something they're willing to build a proficiency in. My wife has had an iPhone for at least a decade now, and I've heard her use Siri perhaps twice in that time.

  • ChildOfChaos a day ago

    The trouble is this seems to me like a short-term fix. Longer term, once the models are much better, Google can just lock Apple out, take everything for themselves, and leave Apple nowhere and even further behind.

    • raw_anon_1111 a day ago

      Of course there is going to be an abstraction layer - this is like Software Engineering 101.

      Google really couldn't care less about Android being good. It is a client for Google search and Google services - just like the iPhone is a client for Google search and apps.

  • cluckindan 9 hours ago

    >Am I missing the elephant in the room?

    Everyone using Siri is going to have their personality data emulated and simulated as a ”digital twin” in some computing hell-hole.

  • [removed] a day ago
    [deleted]
  • haritha-j a day ago

    Agreed, especially since this is a competitive space with multiple players, with a high price of admission, and where your model is outdated in a year, so it's not even capex as much as recurring expenditure. Far better to let someone else do all the hard work and wait and see where things go. Maybe someday this'll be a core competency you want in-house, but when that day comes you can make that switch, just like with Apple silicon.

  • goalieca 16 hours ago

    Apple sells consumer goods first and foremost. They likely don't see a return on investment through increased device or services sales to match the hundreds of billions that these large AI companies are throwing down every year.

  • semiquaver 21 hours ago

    > without burning 10 years of cash flow.

    Sorry to nitpick, but Apple's free cash flow is $100B/yr. Training a model to power Siri would not cost more than a trillion dollars.

    • manquer 15 hours ago

      Of all the companies to survive a crash in AI unscathed, I would bet on Apple the most.

      They are the only ones who do not have large debts off (or on) the balance sheet or aggressive long-term contracts with model providers, and their product demand / cash flow is the least dependent on AI industry performance.

      They will still be affected by general economic downturn but not be impacted as deeply as AI charged companies in big tech.

  • sitzkrieg 12 hours ago

    the year is 2026, the top advertising company is in bed with the walled garden device specialists and the decision is celebrated

  • _joel a day ago

    > without burning 10 years of cash flow.

    Don't they have the highest market cap of any company in existence?

    • jayd16 a day ago

      You don't need to join every fight you see, even if you would do well.

    • fumblebee a day ago

      I believe both Nvidia and Google have higher market caps

    • turtlesdown11 a day ago

      They have the largest free cash flow (over $100 billion a year). Meta and Amazon have less than half that a year, and Microsoft/Nvidia are between $60b-70b per year. The statement reflects a poor understanding of their financials.

  • hashta 19 hours ago

    This also addresses something else...

    Apple, to some users: "Are you leaving for Android because of their AI assistant? Don't leave, we're bringing it to iPhone."

  • PunchyHamster 17 hours ago

    > To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

    Probably not missing the elephant. They certainly have the money to invest, and they do like vertical integration, but putting massive investment into a bubble that can pop or flatline at any point seems pointless when they can just pay to use the current best, and in the future switch to something cheaper or buy some of the smaller AI companies that survive the purge.

    Given how AI-capable their hardware is, they might just move most of it locally too.

  • fooblaster a day ago

    Calling the Neural Engine the best is pretty silly - the best, perhaps, of what is uniformly a failed class of IP blocks: mobile inference NPU hardware. Edge inference on Apple is dominated by CPUs and Metal, which don't use the NPU.

  • SergeAx 14 hours ago

    > without burning 10 years of cash flow

    AAPL has approximately $35 billion of cash equivalents on hand. What other use do they have for this trove? Buy back more stock?

  • caycep 15 hours ago

    Honestly, I'm relieved... it's not really in their DNA and not pivotal to their success; why pivot the company into a U-turn toward a market that's vaguely defined and potentially algorithmically limited?

  • whereismyacc a day ago

    Best inference silicon in the world generally, or specialized for smaller models/edge?

    • properbrew a day ago

      Not even an Apple fan, but from what I've been testing with for my dev use case (only up to 14b) it absolutely rocks for general models.

      • whereismyacc a day ago

        That I can absolutely believe but the big competition is in enterprise gpt-5-size models.

  • kernal 17 hours ago

    >Apple has the best edge inference silicon in the world (neural engine),

    Can you cite this claim? The Qualcomm Hexagon NPU seems to be superior in the benchmarks I've seen.

  • baxuz 21 hours ago

    > bill of materials for intelligence

    There is no intelligence

  • scotty79 a day ago

    > without burning 10 years of cash flow.

    Wasn't Apple sitting on a pile of cash and having no good ideas what to spend it on?

    • ceejayoz a day ago

      That doesn't make lighting it on fire a great option.

    • internetter a day ago

      Perhaps spending it on inference that will be obsoleted in 6 months by the next model is not a good idea either.

      Edit: especially given that Apple doesn’t do b2b so all the spend would be just to make consumer products

    • turtlesdown11 a day ago

      The cash pile is gone, they have been active in share repurchase.

      They still generate about ~$100 billion in free cash per year, that is plowed into the buybacks.

      They could spend more cash than every other industry competitor. It's ludicrous to say that they would have to burn 10 years of cash flow on trivial (relative) investment in model development and training. That statement reflects a poor understanding of Apple's cash flow.

  • mschuster91 14 hours ago

    > Am I missing the elephant in the room?

    Apple is flush with cash and other assets, they have always been. They most likely plan to ride out the AI boom with Google's models and buy up scraps for pennies on the dollar once the bubble pops and a bunch of the startups go bust.

    It wouldn't be the first time they went for full vertical integration.

runjake 21 hours ago

Apple has seemingly confirmed that the Gemini models will run under their Private Cloud Compute and so presumably Google would not have access to Siri data.

https://daringfireball.net/linked/2026/01/12/apple-google-fo...

  • cpeterso 20 hours ago

    Neither Apple's nor Google's announcement says Siri will use Gemini models. Both announcements say, word for word, "Google’s technology provides the most capable foundation for Apple Foundation Models". I don't know what that means, but Apple and Google's marketing teams must have crafted that awkward wording carefully to satisfy some contractual nuance.

    • runjake 19 hours ago

      Direct quote from Google themselves:

      "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."

    • Workaccount2 18 hours ago

      Apple likely wants to post-train a pre-trained model, probably along with some of Google's heavily NDA'ed training techniques too.

    • w10-1 17 hours ago

      > "Google’s technology provides the most capable foundation for Apple Foundation Models"

      Beyond Siri, Apple Foundation Models are available as API; will Google's technologies thus also be available as API? Will Apple reduce its own investment in building out the Foundation models?

    • Ninjinka 20 hours ago

      Check again: https://x.com/NewsFromGoogle/status/2010760810751017017?s=20

      "These models will help power future Apple Intelligence features, including a more personalized Siri coming this year."

      • cpeterso 14 hours ago

        I see what you mean, though I think "these models" refers to Apple's Foundation Models, which "will be based on Google's Gemini models and cloud technology." I guess it depends on what "based" means.

    • baxtr 18 hours ago

      Most likely the wording was crafted by an artificially intelligent entity.

cmiles8 17 minutes ago

Makes sense given the search alliance already in place.

Amazon/AWS was trying to push its partnership with Apple hard once that was revealed, including vague references to doing AI things, but AWS is just way too far behind at this point, so it looks like they lost out here to Google/GCP.

quitit a day ago

This is a bit of a layer cake:

1. The first issue is that there is significant momentum in calling Siri bad, so even if Apple released a higher-quality version it would still be labelled bad. It can enhance the user's life and make their device easier to use, but the overall press will be cherry-picked examples where it did something silly.

2. Basing Siri on Google's Gemini can help to alleviate some of that bad press, since a non-zero share of that doomer commentary comes from brand-loyalists and astroturfing.

3. The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile device hardware is going to have compromises which aren't present on a server farm. To help illustrate that point: We even have the likes of John Gruber making stony-faced comparisons between Apple's on-device image generator toy (one that produces about an image per second) versus OpenAI's server farm-based image generator which makes a single image in about 1-2 minutes. So if a long-running tech blogger can't find charity in those technical limitations, I don't expect users to.

  • JohnMakin a day ago

    Siri is objectively bad though. It isn't some vendetta. I am disabled and there are at least 50 different things that I'd love Siri to do that should be dead simple, yet it cannot. My favorite one was when I suffered a small but not serious fall and decided to test whether Siri could be alerted to call 911 while it was less than 6 feet away from me; it absolutely could not understand, let alone execute, my request. It's a lot of stuff like this. Its core functionality often just does not work.

    > The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile device hardware is going to have compromises which aren't present on a server farm.

    For many years, siri requests were sent to an external server. It still sucked.

    • yreg 13 hours ago

      > Hey Siri, call me an ambulance!

      > Alright, from now on I will call you Anne Ambulance.

    • margalabargala a day ago

      I don't think the parent said that Siri wasn't bad, on the contrary it sounds like they agree.

      Their point is that if Apple totally scraps the current, bad, product called "Siri" and replaces it with an entirely different, much better product that is also named "Siri" but shares nothing but the name, people's perceptions of the current bad Siri will taint their impressions of the new one.

      • quitit 19 hours ago

        It's pretty clear they tried their best to miss or reinterpret the points I made so they could talk about something else.

    • Workaccount2 18 hours ago

      I'd be skeptical about even a new LLM-based Siri being able to dial 911.

      These models tend to have a "mind of their own", and I can totally, absolutely, see a current SOTA LLM convincing itself it needs to call 911 because you asked it how to disinfect a cut.

      • array_key_first 18 hours ago

        Ideally you have a layer before the LLM that filters out stuff the phone can do without an LLM. The LLM probably shouldn't even have the power to call 911; that should be a layer lower. And you probably don't want to send simple queries like "call XYZ" to the cloud; best to just do it locally.
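
        Roughly something like this, just to illustrate the routing idea (the names and patterns are made up, not any real Siri API):

            import Foundation

            enum AssistantAction {
                case dialEmergency
                case call(contact: String)
                case sendToLLM(prompt: String)
            }

            // Deterministic pre-LLM router: safety-critical and trivially local requests
            // are handled on-device, and only the leftovers fall through to a cloud model.
            func route(_ utterance: String) -> AssistantAction {
                let text = utterance.lowercased()

                // Emergency calls never depend on (or wait for) an LLM.
                if text.contains("911") || text.contains("emergency") {
                    return .dialEmergency
                }

                // Simple "call <name>" requests resolve locally against contacts.
                if let range = text.range(of: "call ") {
                    let name = String(text[range.upperBound...]).trimmingCharacters(in: .whitespaces)
                    if !name.isEmpty { return .call(contact: name) }
                }

                // Everything else is fair game for the big model.
                return .sendToLLM(prompt: utterance)
            }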

  • apparent 19 hours ago

    There are many people who lament that Siri sucks but would be happy to admit if/when this changes. Even if it goes from super shitty (as evidenced by randomly calling people I have never called/texted when I ask it to call my wife) to "pretty good" I will be the first to admit that it is better. I look forward to it getting better and being able to use it more often.

  • [removed] 16 hours ago
    [deleted]
  • mucle6 a day ago

    re 3: I doubt Google is going to hand over the weights to Apple to put on device.

    • MaysonL 21 hours ago

      They wouldn’t fit.

      • WorldMaker 15 hours ago

        Not with that attitude.

        Apple and Google have said that Private Cloud Compute will be involved as well, around which Apple is trying to build a mystique of "on-device-like" trust. (Which, yes, if Private Cloud Compute is involved and is as secure as Apple says it is, does presumably imply that the announced deal with Google includes selling Apple the complete model weights.)

    • quitit 19 hours ago

      Nor was such a thing implied. The information in the various news articles about it also doesn't make that claim.

d4rkp4ttern 14 hours ago

Forget Siri, I have a much lower bar — I’ll be happy if they just improve iOS typing corrections/completions, which often don’t make any sense given the rest of the sentence.

  • aurareturn 7 hours ago

    Not just rest of the sentence. In my opinion, autocorrect desperately needs to take into account the context of the current screen.

    There are many times I want to type the same word that is already on the app screen but it autocorrects me to something completely different.

    • theshrike79 5 hours ago

      And they could add predictive text to other languages too, it's not rocket science.

      The current system suggests words I have never used, will never use and have never heard before instead of the obvious choice.

  • al_borland 3 hours ago

    I've been debating turning auto-correct off completely. However, the first iPhone had it, so I'm guessing I want some level of it. I just don't understand how v1 was better than what we have 18 years later.

  • intrasight 14 hours ago

    It's the same core issue, which is basically that their software stack sort of sucks. They should definitely wipe the slate clean when it comes to anything related to language, and that includes typing, text-to-speech, speech-to-text, agents, etc.

    They have the time and the money and the customers, so I'm confident they will accomplish great things.

gnabgib a day ago

Related: Apple nears $1B Google deal for custom Gemini model to power Siri (71 points, 2 months ago, 47 comments) https://news.ycombinator.com/item?id=45826975

  • johnthuss a day ago

    The biggest NEW thing here is that this isn't white-labeled. Apple is officially acknowledging that Google's model will be powering Siri. That explicit acknowledgment is a pretty big deal. It will make it harder for Apple to switch to its own models later on.

    • mdasen a day ago

      Where does it say that it won't be white-labeled?

      Yes, Apple is acknowledging that Google's Gemini will be powering Siri and that is a big deal, but are they going to be acknowledging it in the product or is this just an acknowledgment to investors?

      Apple doesn't hide where many of their components come from, but that doesn't mean that those brands are credited in the product. There's no "fab by TSMC" or "camera sensors by Sony" or "display by Samsung" on an iPhone box.

      It's possible that Apple will credit Gemini within the UI, but that isn't contained in the article or video. If Apple uses a Gemini-based model anonymously, it would be easy to switch away from it in the future - just as Apple had used both Samsung and TSMC fabs, or how Apple has used both Samsung and Japan Display. Heck, we know that Apple has bought cloud services from AWS and Google, but we don't have "iCloud by AWS and GCP."

      Yes, this is a more public announcement than Apple's display and camera part suppliers, but those aren't really hidden. Apple's dealings with Qualcomm have been extremely public. Apple's use of TSMC is extremely public. To me, this is Apple saying "hey CNBC/investors, we've settled on using Gemini to get next-gen Siri happening so you all can feel safe that we aren't rudderless on next-gen Siri."

      • a_paddy a day ago

        Apple won't take the risk of being blamed for AI answers being incorrect. They will attribute Google/Gemini so users know who to be mad at if it doesn't work as expected.

      • HarHarVeryFunny 21 hours ago

        If I were Google, I'd offer Apple a very significant discount to have visible "powered by Gemini" branding.

    • Angostura a day ago

      I don't see why - iOS originally shipped with Google Maps as standard, for example. Macs shipped with Internet Explorer as standard before Safari existed

      • johnthuss a day ago

        The Google Maps situation is a great example of why this will be hard. When Apple switched to their own maps it was a huge failure resulting in a rare public apology from the company. In order to switch you have to be able to do absolutely everything that the previous solution offered without loss of quality. Given Google's competence in AI development that will be a high bar to meet.

      • rrrrrrrrrrrryan 7 hours ago

        Apple ultimately developed their own map application specifically because Google was unwilling to remove the Google logo from the Google Maps app, no matter the price.

        It'll absolutely be interesting to see if "Google" or "Gemini" appear anywhere in the new Siri UI.

        • al_borland 2 hours ago

          As someone who hasn’t used Google Search in several years, I will be upset and less inclined to use the AI if it’s kicking me out to Google search result pages to show results. This is what I fear. Some of this already happens with Siri and Apple Intelligence today. I’m sure Google would love to see even more of it, to serve up ads and take advantage of their new revenue streams in agentic shopping.

    • charliebwrites a day ago

      Why so?

      Apple explicitly acknowledged that they were using OpenAI’s GPT models before this, and now they’re quite easily switching to Google’s Gemini

      • johnthuss a day ago

        The ChatGPT integration was heavily gated by Apple and required explicit opt-in. That won't be the case with the Gemini integration. Apple wants this to just work. The privacy concerns will be mitigated because Apple will be hosting this model themselves in their Private Cloud Compute. This will be a much more tightly integrated solution than ChatGPT was.

      • hu3 a day ago

        I guess the question is, when are they going to use their own model?

        Surely research money is not the problem. Can't be lack of competence either, I think.

    • dewey a day ago

      Don't think that's an especially big deal, they've always included third party data in Siri or the OS which is usually credited (Example: Maps with Foursquare or TomTom, Flight information from FlightAware, Weather data and many more).

    • insin a day ago

      They can also put "Google" in the forever-necessary disclaimer

      Google AI can make mistakes

  • dylan604 a day ago

    Is this another one of those AI deals where no real money changes hands? In this case, doesn't this just offset the fee Google pays Apple for having their search as the default on Apple devices?

    • asadotzler 18 hours ago

      I'll wager the accounting for the two contracts is separate. There may be stipulations that connect the two, but the payment from Google to Apple of $20B+/yr is a long-established contract (set of contracts, actually) that Apple would not jeopardize for the relatively small Apple-to-Google $1B/yr contract, one still unproven and which may not stand the test of time.

      So, yes, practically speaking, the Apple to Google payment offsets a tiny fraction of the Google to Apple payment, but real money will change hands for each and very likely separately.

    • aoeusnth1 a day ago

      So changing cash flows (fee money) isn't real enough now?

asadm 21 hours ago

OpenAI had it, they had the foot in the door with their integration last year with Siri. But they dropped that ball and many other balls.

  • czscout 21 hours ago

    Yeah, I was really expecting them to just continue the partnership that Apple announced when the iPhone 16/iOS 18 came out, but I suppose it's been pretty much radio silence on both fronts since then. Although the established stability and good enough-ness that Google offers with Gemini are probably more than enough reason for Apple to pivot to them as a model supplier instead.

  • toasterlovin 19 hours ago

    I'm sure hiring Jony Ive to design hardware for them didn't help.

  • jquery 20 hours ago

    Yeah. Super disappointing. I may end up switching to Gemini entirely at this rate.

mitchitized an hour ago

What I want to know is the privacy impact of this partnership. I see terms like "Apple will be running Google's models on their infrastructure" but that definitely is not enough detail for me to know where my data is going.

Any details on privacy and data sharing surfaced yet?

ggm 16 hours ago

I think this is a good move for Apple. It avoids tying them directly to internalised beliefs in their own AI model, it avoids all the capex around building out an AI engine and associated DC, it reduces risk, and it keeps Google in a relationship under contract which Google will value - probably enough to think hard about stupid legal games regarding the Play Store and walled gardens.

Apple plainly doesn't believe in the uplift and impending AGI doom. Nor do they believe there's no value in AI services. They just think for NOW at least they can buy in better than they can own.

But based on Apple's long-term VLSI vision, and on their past behaviour with IPR in any space, they will ultimately take ownership.

  • bigyabai 12 hours ago

    > they will ultimately take ownership.

    How? People have been saying this since CoreML dropped nine years ago. Apple is no closer to revamping Siri or rebuking CUDA than they were back then.

    • ggm 9 hours ago

      When the cost of deployment drops, and when their own chip designs are profitable, they'll take the capital hit. Until then, as long as the income split for Gemini-backed Siri isn't terrible, they'll keep outsourcing. If they persuade Google to deploy Apple chips into the service, I'd say an in-house model is within sight.

      Apple Private Relay runs on Cloudflare and Fastly, and I believe one other major provider. They certainly can and do run services for a long time with partners.

sublimefire 6 hours ago

Personally, as someone bought into the Apple ecosystem, this is worrying. I am aware of how PCC (the likely target platform) is supposed to work, but a deal with Google of all companies sends a bad signal to privacy-focused consumers. If such a feature is baked in without a way to switch it off, my next device will not be an iPhone, MacBook, or iPad.

  • romanovcode 6 hours ago

    You have been able to disable Siri since the very beginning; why would it suddenly not have a toggle to disable it?

elzbardico a day ago

Models are becoming commodities, and their economy doesn't justify the billions required to train a SOTA model. Apple just recognized that.

  • aurareturn 7 hours ago

    Models are becoming less like commodities. They're differentiating with strengths and weaknesses. When Chinese labs gain more traction, they will stop releasing their models for free. At that point, everyone who wants SOTA models will have to pay.

    • elzbardico 43 minutes ago

      Having to pay has nothing to do with a good being a commodity. I have to pay for sugar, but there is no big difference between brands that justifies any of them commanding a monopoly rent, so sugar is a commodity. The same is more or less true of LLMs right now, and unless someone comes up with a new paradigm beyond the transformer architecture, there is no reason to believe this commodification trend is going to be reversed.

      Most of the differentiation is happening on the application/agent layer. Like Coworker.

      The rest of it, is happening on post-training. Incremental changes.

      We are not talking about EUV lithography here. There are no substantial moats of years of pure and applied research protected by patents.

      • aurareturn 42 minutes ago

        Normal software has way less moat than SOTA labs.

        SOTA AI models can have different architectures, vastly different compute in training, different ways of inferencing, different input data, different RL, and different systems around the model. Not to mention the significant personal user data that OpenAI is collecting.

        Saying SOTA AI models are like sugar is insane.

  • [removed] 18 hours ago
    [deleted]
jmacd a day ago

This is one of those announcements that actually just excites me as a consumer. We give our children HomePods as their first device when they turn 8 years old (Apple Watch at 10 years, laptop at 12) and in the 6 years I have been buying them, they have not improved one ounce. My kids would like to listen to podcasts, get information, etc. All stuff that a voice conversation with Chatgpt or Gemini can do today, but Siri isn't just useless-- it's actually quite frustrating!

  • CephalopodMD 11 hours ago

    Not exactly the same, but kinda: my gen 1 Google Home just got Gemini and it finally delivers on the promise of like 10 years ago! Brought new life to the thing beyond playing music, setting timers, and occasionally asking really basic questions

  • aixpert 8 hours ago

    It's not going to help them. For Siri to be really useful it would need deep system integration, and an external model is not going to provide that. People didn't believe me when I said the same about Apple Intelligence and OpenAI.

  • 46493168 a day ago

    It's absolutely insane that you can't say "Siri, play my audiobook" and have it play the last audiobook you listened to. Like, come on.

    • http-teapot 21 hours ago

      Or when you are driving and someone sends a yes/no question where the answer is no.

      Siri: Would you like to answer?

      Me: Yes

      Siri: ...

      Me: No + more words

      Siri: Ok (shuts off)

  • layer8 a day ago

    It remains to be seen what the existing HomePods will support. There’s been a HomePod hardware update in the pipeline for quite some time, and it appears like they are waiting for the new Siri to be ready.

  • knallfrosch a day ago

    Siri still can't play an Apple Music album when there is a song of the same name.

    Even "Play the album XY" leads to Siri only playing the single song. It's hilariously bad.

    • billti 18 hours ago

      Or the even more frustrating:

      Me: "Hey Siri, play <well-known hit song from a studio album that sold 100m copies>"

      Siri: "OK, here's <correct song but a live version nobody ever listens to, or some equally obscure remix>"

      Given that these things are, at their core, probability machines... how? Why?

      • troad 15 hours ago

        > Given that these things are, at their core, probability machines... how? Why?

        Is Siri a probability machine? I didn't think it was an LLM at all right now; I thought it was some horrendous tree of switch statements, hence the difficulty of improving it.

        Apple search is comically bad, though. Type in some common feature or app, and it will yield the most obscure header file inside the build deps directory of some Xcode project you forgot existed.

apitman 19 hours ago

> After careful evaluation, we determined that Google’s technology provides the most capable foundation for Apple Foundation Models

Sounds like Apple Foundation Models aren't exactly foundational.

zeras a day ago

This is actually a smart and common sense move by Apple.

The non-hardware AI industry is currently in an R&D race to establish and maintain market share, but with Apple's existing iPhone, iPad, and Mac ecosystem, they already have a market they control, so they can wait until the AI market stabilizes before investing heavily in their own solutions.

For now, Apple can partner with solid AI providers to deliver AI services and benefits to their customers in the short term, and later they can acquire established AI companies to jumpstart their own AI platform once the technology reaches more long-term consistency and standardization.

sebastianconcpt 5 hours ago

Pity they don't have their own thing at that level. They had a great start introducing Siri, then totally missed the train.

  • sebastianconcpt 5 hours ago

    Someone inside and up there neglected the "a bicycle for the mind" part of the vision.

noduerme 7 hours ago

I thought it was interesting that a Google flack stressed that the model would run on Apple's compute, and seemed to imply it might even run on-device. Allegedly this was said to allay the (expected) privacy concerns of Apple users who wouldn't want their Siri convos shared with Google.

But I saw something else in that statement. Is there going to be some quantized version of Gemini tailored to run on-device on an M4? If so, that would catapult Apple into an entirely new category merging consumer hardware with frontier models.
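
Back-of-the-envelope, and purely illustrative since nobody has published parameter counts for a hypothetical on-device Gemini: a distilled model in the single-digit-billions range, quantized to 4 bits, would fit in an M4's unified memory with room to spare, while anything much larger starts squeezing a 16 GB machine.

    # rough weight-memory sketch; the parameter counts are assumptions, not Gemini specs
    def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
        bytes_per_weight = bits_per_weight / 8
        return params_billion * 1e9 * bytes_per_weight / 1e9  # gigabytes of weights

    for params in (3, 8, 27):          # hypothetical model sizes, in billions of params
        for bits in (16, 4):           # fp16 baseline vs. 4-bit quantized
            print(f"{params}B @ {bits}-bit: {weight_memory_gb(params, bits):.1f} GB of weights")
    # 8B at 4-bit is ~4 GB of weights (plus KV cache and activations), plausible on a
    # 16 GB M4; 27B at 4-bit is ~13.5 GB and gets tight fast.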

  • sublimefire 6 hours ago

    You can already run quantized models without much friction; people even have dedicated apps for that. It changes very little, because everyone who wanted to do it has already solved it, and those who don't, don't care. It's a marginal gain for consumers, a feature to brag about for Apple, and a big gain for Google. Users would also need to change existing habits, which is undoubtedly hard to do.
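
    For anyone curious what "without much friction" means in practice, here is a minimal sketch using the mlx-lm package on an Apple Silicon Mac; the repo name is just an example of a community 4-bit quant, not anything Apple or Google ships:

        # pip install mlx-lm  (Apple Silicon only)
        from mlx_lm import load, generate

        # any 4-bit quantized repo from the mlx-community hub should work here
        model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

        reply = generate(model, tokenizer,
                         prompt="Summarize why on-device inference matters.",
                         max_tokens=128)
        print(reply)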

kenjackson a day ago

Somewhat surprising. AI is such a core part of the experience. It feels like a mistake to outsource it to arguably your biggest competitor.

  • crazygringo a day ago

    It's clear they don't have the in-house expertise to do it themselves. They aren't an AI player. So it's not a mistake, just a necessity.

    Maybe someday they'll build their own, the way they eventually replaced Google Maps with Apple Maps. But I think they recognize that that will be years away.

    • al_borland 2 hours ago

      Apple has been using ML in their products for years, to the point that they dedicated parts of their custom silicon for it before the LLM craze. They clearly have some in-house ML talent, but I suppose LLM talent may be a different question.

      I'm wondering if this is a way to shift blame for issues. It was mentioned in an interview that what they built internally wasn't good enough, presumably due to hallucinations... but every AI does that. They know customers have a low tolerance for mistakes, and any issues will quickly become a meme (see the Apple Maps launch). If the technology is inherently flawed and will never live up to their standards, then by outsourcing it they can point to Google as the source of the failings. If things get better down the road and they can improve by pivoting away from Google, they'll look better and it will make Google look bad. This could be the long game.

      They may also save a fortune in training their own models, if they don’t plan to directly try to monetize the AI, and simply have it as a value add for existing customers. Not to mention staying out of hot water related to stealing art for training data, as a company heavily used by artists.

    • kenjackson a day ago

      I agree that they don't appear poised to do it themselves. But why not work with Meta or OpenAI (maybe a bit more questionable with MS) or some other player, rather than Google?

      • crazygringo a day ago

        The optics of working with Meta make it a non-starter. Apple symbolizes privacy, Meta the opposite.

        With OpenAI, will it even be around 3 years from now, without going bankrupt? What will its ownership structure look like? Plus, as you say, the MS aspect.

        So why not Google? It's very common for large corporations to compete in some areas and cooperate in others.

        • anonymouskimmer 21 hours ago

          SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.

          I didn't see your 41-day-old reply to me until it was too late to comment on it. So here's a sarcastic "thanks" for ignoring what I wrote and for telling me that exactly what I was complaining about is the solution to the problem I was complaining about.

          https://news.ycombinator.com/item?id=46114935

          1) I told you my household can't use Target or Amazon for unscented products, without costly remediation measures, BECAUSE EVEN SCENT-FREE ITEMS ARRIVE SMELLING OF PERFUME FROM CROSS-CONTAMINATION DURING CLEANING, STORAGE, AND TRANSPORTATION. SOMETIMES REALLY BADLY.

          FFS. If you are going to respond, first read.

          I also mentioned something other than "government intervention to dictate how products are made" as a solution to this issue, namely adequate segregation between perfumed and non-perfumed products.

          And I care less about my wallet than I do about my time and actual ability to acquire products that are either truly scent free, or like yesteryear, don't have everlasting fragrance fixatives.

          For people in my position, who make up a small percentage of the population (but still number in the millions), the free market has failed. We are a specialized niche that trades tips on how to make things tolerable.

          SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.

    • WithinReason a day ago

      Apple publishes surprisingly good AI papers, with a lot of work on bridging research and product.

      • hu3 10 hours ago

        hey Siri, search Apple papers about AI.

  • deergomoo 20 hours ago

    > AI is such a core part of the experience

    For who? Regular people are quite famously not clamouring for more AI features in software. A Siri that is not so stupendously dumb would be nice, but I doubt it would even be a consideration for the vast majority of people choosing a phone.

    • ares623 15 hours ago

      For the shareholders, the only ones that really matter

  • gregoriol a day ago

    They could use it like Google Search, not as the first thing the user sees, but as a fallback

  • asadotzler 18 hours ago

    Web search is a core part of browsing, and Apple is Google's biggest competitor in browsers. Google pays Apple roughly 25x more for integrating Google Search into Safari than Apple will be paying Google to integrate Google's LLMs into Siri. If you think depending on your competitor is a problem, you should really look into web search, where all the real money is today.

  • xtoilette a day ago

    How much do the two revenue streams actually overlap in reality?