Danieru 4 days ago

No one appears to have mentioned the important meta game going on: Intel bidding as a credible alternative supplier.

For Intel, by bidding they get to undercut AMD's profits.

For Sony, they get a credible alternative which they can pretend would be a viable choice, thus forcing a slightly better deal from AMD.

We saw similar articles about the Switch 2. That time it was AMD acting as spoiler to Nvidia, who reportedly got the contract. That time too, we got news articles lamenting the loss for AMD.

As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

Switching vendors does not just break compatibility with old games, it also requires retooling their internal libraries. Console games, outside small or open-source engines, use proprietary graphics APIs. Those APIs are tied to the hardware. With this coming generation from Nintendo, and the "current gen" from Sony and Xbox, they've been able to reuse much of their software investment. I'd say more but this is obviously under NDA; other devs should be able to confirm.

Thus I don't think AMD for the Switch 2 or Intel for the PS6 was ever a credible path. Their bids existed to keep the incumbent vendor from getting overly greedy and ruining the parade for everyone. This matters: famously, the original Xbox got hamstrung in the market by Nvidia's greed and refusal to lower prices as costs went down.

  • senkora 4 days ago

    +1. An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

    Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console. If you changed the micro-architecture (as switching to Intel would), then old games could suddenly become janky and slow even though both systems are x86.

    (This would only matter if you were pushing performance to the edge, which is why it rarely matters for general software development, but console game dev pushes to the edge)

    So it isn't just the graphics APIs that would change going from AMD to Intel, but the CPU performance as well.
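
    As a concrete illustration of the kind of timing-dependent choice I mean (my own sketch with SSE intrinsics, not from any actual console codebase): which of these two ways of computing 1/sqrt(x) wins depends entirely on the core's timings for rsqrtps vs. sqrtps/divps, so code tuned for one micro-architecture can lose on another.

      #include <xmmintrin.h>

      /* wins on cores where sqrtps/divps are slow: a ~12-bit rsqrtps
         estimate refined with one Newton-Raphson step */
      static __m128 rsqrt_fast(__m128 x) {
          __m128 e = _mm_rsqrt_ps(x);
          /* e' = 0.5f * e * (3.0f - x*e*e) */
          return _mm_mul_ps(_mm_mul_ps(_mm_set1_ps(0.5f), e),
                            _mm_sub_ps(_mm_set1_ps(3.0f),
                                       _mm_mul_ps(x, _mm_mul_ps(e, e))));
      }

      /* wins on cores with fast full-precision sqrt/div */
      static __m128 rsqrt_exact(__m128 x) {
          return _mm_div_ps(_mm_set1_ps(1.0f), _mm_sqrt_ps(x));
      }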

    • deaddodo 4 days ago

      > Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console.

      While that can be true, very few gamedev companies these days optimize to that degree. They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.

      Nobody is hand-tuning assembly code these days to fit into tight instruction windows, at least not outside of some very specific logic fragments. Instead they're all writing generic interrupt-based logic, which is fine, as that's what the newer CPUs expect and optimize for internally.

      In addition, the gap between Zen generations is as large as a switch to Intel would be. We're talking fairly different cache coherency, memory hierarchies, CCX methodologies, micro-op and instruction timings, iGPU configurations, etc.

      That all being said, AMD was going to beat Intel regardless, because of established business relationships and because Intel's current internal struggles (both business-wise and in R&D) make it fairly difficult for them to provide an equivalent alternative.

      • soganess 3 days ago

        Asking this as an open-ended (if leading) question: I assume enough people are doing it, otherwise the PS5 Pro makes no sense... Right?

        They (AMD/Sony) shoehorned the RDNA 3/3.5 GPU architecture onto an older Zen 2 core, on a different process node, because... they felt like making a frankenAPU? Especially since the APUs are usually monolithic (vs. chiplet) in design and share a memory controller. Surely it would have been easier/cheaper to put in 8 Zen 4c/5c cores and call it a day.

        I'm pretty sure I'm just missing something obvious...

      • MichaelZuo 3 days ago

        How would you explain cross PS5/PC releases being much more efficient on the PS5?

        e.g. Horizon Forbidden West needing a much better GPU on PC to run at the same level of fidelity as the PS5.

        If not for special tuning specific to the PS5’s differences.

        (I can imagine Windows bloat and other junk requiring an additional 10% to 20%, but not 30% to 50%.)

      • HelloNurse 3 days ago

        And, more simply, Moore's Law should ensure that in a next-generation console with a new microprocessor architecture, slowdown in some instructions and memory-access patterns is compensated by a general speedup, limiting performance regressions to terribly unfortunate cases (which should be unlikely, and so obvious that they get mitigated).
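
        (Illustrative arithmetic: if the new core is 3x faster in general, even an instruction mix whose relative timing got 2x worse still ends up 1.5x faster in absolute terms.)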

    • mikepavone 4 days ago

      > An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

      What? The Jaguar-based CPU in the PS4 has both a much lower clock and substantially lower IPC than the Zen 2 based one in the PS5. The timings are not remotely the same and the micro-architectures are quite different. Jaguar was an evolution of the Bobcat core which was AMD's answer to the Intel Atom at the time (i.e. low cost and low-power, though it was at least an out-of-order core unlike contemporary Atoms).

      Going from GCN to RDNA on the GPU side is also a pretty significant architectural change, though definitely much less than going from AMD to Intel would be.

      • senkora 3 days ago

        I did some more research and I was wrong.

        My source was an AMD tech talk from years ago where they mentioned keeping instruction timings the same for backwards compatibility reasons.

        I believe they were talking about this for the XBox One X: https://en.wikichip.org/wiki/microsoft/scorpio_engine#Overvi... (and a similar chip for the PS4 Pro)

        So basically, they upgraded and lightly enhanced the Jaguar architecture, shrunk the process (28nm -> 16nm), but otherwise kept it the same. AMD Zen was released around this time and was far superior but they decided to stick with Jaguar in order to make sure that instruction timings were kept the same.

        I guess that they didn't want two hardware revisions of the same console generation running on different micro-architectures, but they were okay switching the micro-architecture for the next console generation.

    • jheriko 4 days ago

      you clearly haven't played a modern game :P

      developers taking care over cpu timings is 10-15 years out of date. most of them these days don't even know what a dot product is, or how to find the distance to a point or to a straight line between two points... and the people they rely on to do this for them make horrendous meals of it.

      but yeah, sure, cpu instruction timings matter.
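
      (to be concrete, this is the sort of thing i mean - a plain C sketch, names mine, of point-to-line distance via dot products:)

        #include <math.h>

        typedef struct { float x, y, z; } v3;

        static float dot3(v3 a, v3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
        static v3 sub3(v3 a, v3 b) { v3 r = { a.x-b.x, a.y-b.y, a.z-b.z }; return r; }

        /* distance from p to the infinite line through a and b: project
           p-a onto the line direction, measure the perpendicular rest */
        static float dist_point_line(v3 p, v3 a, v3 b) {
            v3 d = sub3(b, a), ap = sub3(p, a);
            float t = dot3(ap, d) / dot3(d, d);  /* closest point param */
            v3 perp = { ap.x - t*d.x, ap.y - t*d.y, ap.z - t*d.z };
            return sqrtf(dot3(perp, perp));
        }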

      • DaoVeles 3 days ago

        I was about to say. I bailed out of the industry just as the Xbox One/PS4 was coming in. Even with the 360/PS3, it was considered wise to steer clear of that kind of low-level stuff just for one's sanity. When the XB1/PS4 came in, it was completely abandoned; it turns out x86 compilers combined with OoO execution made that kind of tinkering not only nearly pointless but sometimes actively harmful to performance.

        Nowadays, I suspect it is almost entirely in the hands of the compilers, the APIs and the base OS to figure out the gritty details.

        • xgkickt 3 days ago

          There are still manual optimizations that can be done (non-temporal writes where appropriate for example), but nothing like the painstaking removal of Load-Hit-Stores and cache control of the 360/PS3 era.
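
          To illustrate (a minimal sketch, names mine; assumes dst is 16-byte aligned and n is a multiple of 4 floats): non-temporal stores let write-once data bypass the cache so it doesn't evict the working set.

            #include <stddef.h>
            #include <xmmintrin.h>

            /* stream a large buffer past the cache with non-temporal
               stores; the sfence flushes the write-combining buffers */
            static void stream_copy(float *dst, const float *src, size_t n) {
                for (size_t i = 0; i < n; i += 4)
                    _mm_stream_ps(dst + i, _mm_loadu_ps(src + i));
                _mm_sfence();
            }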

      • Meganet 3 days ago

        The new chip will simply be faster. I would bet that bandwidth between certain components is a lot more critical, or NUMA effects, or bandwidth between cores.

        I'm surprised that cpu instruction latency gets mentioned before other factors.

    • lxgr 3 days ago

      Given the size of such a contract, wouldn't it be reasonable for Sony to just request equal or better instruction latency for everything relevant from the old CPU?

  • ksec 4 days ago

    Adding a bit more context.

    Nvidia got the bid for the Switch when they were basically dumping those unwanted Tegra chips on Nintendo for an incredibly low price.

    Xbox and PlayStation don't earn AMD much profit at all. But the custom-processor segment barely kept AMD surviving; people may forget AMD was only worth ~$3B in market cap in 2016. They are now at ~$250B.

    On the subject of software compatibility though, one thing I got wrong was my prediction that having AAA titles on Xbox and PS would help AMD's market share on PC, given those titles are already optimised for Xbox and PS anyway. That didn't happen at all, and Nvidia continues to dominate.

    • elzbardico 3 days ago

      Sometimes a low-margin business is all you need, and all you have, to keep the lights on, avoid hemorrhaging too many people, and stay afloat until better winds arrive.

      Traditional MBA thinking is sometimes too short-sighted. For example, PCs might not have been a cash cow for IBM, but the ThinkPad brand, the distributor relationships and the customer relationships may have helped IBM more than the cash from selling that business to Lenovo. Maybe keeping a healthy bridgehead with a popular brand of laptops could have helped IBM come up with some innovative way of selling the overhyped Watson.

      The same goes for AMD and video games: it paid the bills, paid salaries and left a little profit on the table to be invested. It probably helped them bridge from their hell days to what they are today.

      There are a lot of intangibles, hidden synergies and serendipitous opportunities that are frequently overlooked by our bean-counting overlords.

    • sangnoir 3 days ago

      > Xbox and Playstation dont earn AMD much profits at all

      It doesn't cost them much either. Lisa Su, in an interview posted to HN a few months ago, said it is a deliberate strategy to repackage IP AMD has already developed. They are willing to pull designs from the shelf and customize them to meet partners' needs. Having a long tail of products adds up, and sets you up to get first dibs on higher-margin partnerships in the future.

    • derstander 3 days ago

      > Nvidia got the bid for Switch when they were basically dumping those unwanted Tegra to Nintendo for an incredibly low price.

      This seems pretty well aligned with Gunpei Yokoi’s strategy of “Lateral Thinking [with] Withered Technology”. It worked out pretty well for Nintendo in the past (e.g., the Game Boy) and seems to be working out with the Switch. Even though he has passed, his Wikipedia page alleges that this philosophy has been passed on to others at Nintendo.

      • lynguist 3 days ago

        > Withered technology

        At the time of its release, the Nintendo Switch's CPU was only a single generation behind ARM's latest offering, and its GPU was by far the most powerful mobile GPU available. The "withered technology" label doesn't hold true for the Switch.

        What happened is that mobile compute has advanced tremendously since 2017, while the Switch is stuck on technology that was leading in early 2017.

        • pjmlp 2 days ago

          While it provides marvelous gaming experiences, faster polygons don't equate to better games. This is especially an issue on the latest-gen PlayStation and Xbox, where many games with great graphics offer a lousy gameplay experience at high prices.

    • DaoVeles 3 days ago

      A few of the Playstation titles that made their way to PC do seem to have a little home field advantage on AMD chips, but not enough to sway people over to them.

    • lupusreal 3 days ago

      > having AAA titles on Xbox and PS would have helped AMD's market share on PC, given those titles are already optimised on Xbox and PS anyway. That didn't happen at all. And Nvidia continue to dominate.

      My impression is that console ports have insufficient popularity with PC gamers for them to alter their hardware purchasing habits for those games.

  • lxgr 3 days ago

    > Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    But would they really?

    Staying on x86-64 would take care of CPU compatibility (unless there's some exotic AMD-only instruction set extension heavily used by PS4/5 games), and a GPU emulation stack seems at least somewhat plausible.

    Sony has pulled this off multiple times before with seemingly completely incompatible architectures:

    The PS2 came with the PS1 CPU (repurposed as an IO controller, but fully available for previous-gen games) and emulated the GPU on its own. The PS3 did the reverse in its second iteration (i.e. it included the PS2's GPU and emulated the CPU). The PS Vita's SoC had the PSP MIPS CPU included on-die, which in turn is similar enough to the PS1's to allow running those games too.

    • DSMan195276 3 days ago

      For GPU emulation, I'm not super knowledgeable, but I would think the shaders are a big issue; older systems didn't have that problem. Console games ship with precompiled shaders, and you won't be able to reuse those between AMD and Nvidia. Certainly you can get around it, emulators for modern consoles do just that, but it's not without its issues, which might be considered unacceptable.

      That's still fixable if you're willing to ship newly compiled shaders and such, but that's a lot more work if it means some kind of per-game fix has to be downloaded. This is how Xbox 360 backwards compatibility works on newer consoles, and this approach means it only works with a subset of Xbox 360 games, not all of them. It's much better than nothing, but it's not a hardware-level fix that makes the original game binaries "just work".

      As for packaging the old GPU with the new system, I think that's not really realistic anymore, since prices for old chips simply don't drop enough and the system design would be a mess (the chips are huge and you'd need cooling for both; I guess if only one runs at a time it's not as bad, but...). Separately, if you're swapping from Nvidia to AMD, you're talking about trying to convince one of them to make a batch of old chips for you while you use their competitor's chip as the main one; they might not be willing to do it.

      • lxgr 3 days ago

        Would it not be possible to recompile all shaders at startup (or "install", i.e. first launch) time and then cache them (if runtime recompilation is even too slow in the first place)?
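
        Something content-addressed, filled at install/first-launch time, seems like the natural shape. A minimal sketch (all names hypothetical; translate_shader stands in for whatever old-GPU-to-new-GPU bytecode compiler this would need):

          #include <stdio.h>
          #include <stdint.h>
          #include <stdlib.h>
          #include <stdbool.h>

          /* hypothetical: recompile old-GPU shader bytecode for the new GPU */
          bool translate_shader(const uint8_t *in, size_t in_n,
                                uint8_t **out, size_t *out_n);

          /* return native code for a shader, translating once and caching
             on disk keyed by a content hash of the original bytecode */
          bool shader_lookup(const char *dir, uint64_t hash,
                             const uint8_t *bc, size_t bc_n,
                             uint8_t **out, size_t *out_n) {
              char path[256];
              snprintf(path, sizeof path, "%s/%016llx.bin", dir,
                       (unsigned long long)hash);
              FILE *f = fopen(path, "rb");
              if (f) {                            /* hit: reuse old result */
                  fseek(f, 0, SEEK_END);
                  *out_n = (size_t)ftell(f);
                  rewind(f);
                  *out = malloc(*out_n);
                  if (fread(*out, 1, *out_n, f) != *out_n) { fclose(f); return false; }
                  fclose(f);
                  return true;
              }
              if (!translate_shader(bc, bc_n, out, out_n))
                  return false;                   /* miss: translate once */
              if ((f = fopen(path, "wb"))) {      /* persist for next boot */
                  fwrite(*out, 1, *out_n, f);
                  fclose(f);
              }
              return true;
          }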

  • jm4 4 days ago

    The whole article seems unfair to Intel. They didn’t lose the contract because they didn’t have it in the first place. I think your analysis is correct. They win a little if they don’t get the contract and they win a lot if they do. It was a no brainer to bid on it.

  • neighbour 4 days ago

    This is all true. Xbox always threatens to leave its current vendors, only to end up signing a renewal in the final hours of the contract.

    >As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    In your view, is this issue worse with modern consoles now that the Playstation (and possibly Nintendo) online store purchases persist across generations? Imagine a scenario where someone has a PS4 and PS5, they buy many games through the Playstation Store, then Sony selects a different chip supplier for the PS6. I'm guessing this would cause issues with games that were designed for the older consoles, breaking backwards compatibility.

    I'd imagine that if the console manufacturers cared about backwards compatibility, which I think they do, the likelihood of them switching chip providers would decrease with each generation.

    • wmf 4 days ago

      Microsoft maintained backwards compatibility across Intel+Nvidia, IBM+ATI, and AMD+AMD so it's possible. Sony hasn't invested as much in compatibility, instead just keeping the same architecture for PS4/5.

      • lxgr 3 days ago

        Sony has historically invested a lot into backwards compatibility, going as far as shipping the previous gen's GPU and/or CPU with the PS2, initial PS3 models, and the PS Vita.

        PS3 compatibility on the PS4 was notably absent, though.

      • neighbour 4 days ago

        True, but if you're referring to the fact that you can play Xbox and Xbox 360 games on newer hardware, I believe Microsoft has a team that individually patches these games to work on newer hardware.

        Sony does something similar I believe with their new Classics Catalogue as part of their most premium PS Plus tier.

        • jamesfinlayson 3 days ago

          Yeah, I remember the Xbox 360 being hit and miss with backwards compatibility - their FAQs said that most of the time the people working on it had to look at the raw assembly of the games they were trying to get running to figure out what went wrong.

      • etempleton 3 days ago

        Most games were not backwards compatible between the Xbox and Xbox 360. They had to do work to make each game work and prioritized the most popular ones, most notably Halo. With that said, certain features did not work properly: there was a Halo 2 map they took out of the online pool because it used a heavy fog effect that would not render on the 360.

        From 360 to Xbox One there was a similar situation where they would patch individual games to work, but because it was at least partially emulated, publishers had to sign off on allowing their game to be backwards compatible.

    • lxgr 3 days ago

      There was no backwards compatibility between the PS3 and PS4 whatsoever (except for PS Plus allowing cloud-based game streaming of some PS3 titles), and Sony survived that as well.

      What they did was offer some old PS2 games for purchase, though, which allowed them to tap into that very large back catalog. I could see something like this happen for a hypothetical Intel PS6 as well (i.e. skipping PS5 backwards compatibility and tapping into the large catalog of PS4 and PS4/PS5 games).

  • aurareturn 3 days ago

    I'm pretty sure the PS5 runs x86 and Vulkan. Both are standardized. That's why PS5 games can be easily ported to PCs running Intel and Nvidia.

    So I’m not buying that going Intel would lose backwards compatibility.

    • pjmlp 3 days ago

      I am quite sure the PS5 doesn't do Vulkan at all, and you don't even need NDA access to know that; there are enough GDC talks and SCEE presentations on what APIs PlayStations do support.

    • mastax 3 days ago

      It's not clear to me that the PS5 supports Vulkan at all (excluding third-party translation layers). I would be happy to see any evidence. In any case, I'm confident the large majority of PS5 games use its native API, GNM.

      GNM could certainly be implemented for Intel GPUs, but it’s an additional cost to account for.

  • johnnyanmac 4 days ago

    Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

    I suppose it can be seen as controlling rampant greed (especially for Nvidia), but it feels like the consoles dealt the cards here. There would have needed to either be some revolutionary tech or an outright schism to make a business steer an otherwise smooth ship that way.

    >As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    I agree that both are probably playing it safe this time. But as a devil's advocate: both Sony and Nintendo are no strangers to ditching the previous gen if they don't want to compromise their next gen. At this point Nintendo skews towards ditching (SNES/N64/GameCube/Switch vs. Wii/Wii U).

    Sony tried and almost failed hard with the PS3 (kind of before, with the whole SKU debacle, and then ditched BC after) but is otherwise usually consistent on backwards compatibility. Well, that and the Vita. But I don't think anyone missed the UMD (it was still backwards compatible digitally, though).

    • philistine 3 days ago

      > At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).

      Ultimately, a company is its people. And the management class at Nintendo is famously new. Everybody is expecting them to focus on robust backwards compatibility as part of their new, exciting development.

    • Tuna-Fish 3 days ago

      > Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

      I think there will be sufficient time between now and the PS6's release for them to support full real-time ray tracing (RTRT).

    • ac29 3 days ago

      > Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

      Why would they need to do a lot more work on compatibility if they'd picked Intel vs AMD?

      Either CPU is presumably going to be x86_64. The GPU is almost certainly going to be much different between AMD's PS5 GPU and AMD's PS6 GPU, so the graphics driver will need work either way.

      • yangff 3 days ago

        They could have AMD provide a compatibility layer for the GPU (although this might be a bad idea), but implementing an AMD compatibility layer on Intel/NV clearly seems like an even worse idea. But at least you might be able to run the already compiled shaders in compatibility mode?

  • Meganet 3 days ago

    Those IP blocks are bought from specialist companies, right?

    I would assume if Intel can make ARM and x86, it can do whatever Sony needs.

    Or is AMD's architecture THAT special? My assumption is that the PS3's streaming processor was so different that it would have mattered, but with the PS4 and 5?

    You could also patch PS5 games if you need to. The ecosystem is closed.

  • zelon88 3 days ago

    > Switching vendors does not just invalidate old games compatibility, it also requires retooling for their internal libraries.

    This is a red herring. The hardware is x86-64, all the game engines are made on x86-64, and all the games are compiled on, you guessed it, x86-64. That's why they stopped using PowerPC, or Motorola, or other non-x86 architectures: to simplify backwards compatibility and actually get value comparable to a decent-performing system.

    So when they tell you there is a cost overhead associated with switching vendors, that is BS. However long it takes to port your desktop driver package is how long it would take to get all of this working on different hardware.

    Seriously, if someone in a basement in Arkansas can get Windows to run on a PowerPC PS3, Sony can figure out how to make x86-64 AMD games work on an x86-64 Intel chip. Anyone saying otherwise has an incentive to not make it happen.

  • smcl 3 days ago

    I'm not convinced; this feels like those "actually this is good for bitcoin" replies that are popular with cryptobros any time some bad news hits. Intel has lost out on a big, high-profile contract. This cannot be something they are happy with, and any explanation to the contrary is, as the kids say, "cope".

bangaladore 4 days ago

Maybe I'm misinformed, but I could never see Intel getting this contract.

AMD has extensive experience with high-performing APUs, something Intel, at least in my memory, does not have. The chips in modern high-end consoles are supposed to compete with discrete GPUs, not with integrated graphics. Does Intel even have any offerings that would indicate they could accomplish this? Intel has Arc, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

  • janice1999 4 days ago

    Intel has Battlemage [1]. Presumably that would be the basis of the console APU. Their iGPU performance is actually getting good now. [2]

    [1] https://www.pcgamer.com/hardware/graphics-cards/embargo-no-p...

    [2] https://www.tomshardware.com/pc-components/cpus/lunar-lake-i...

    • Scramblejams 4 days ago

      > Their iGPU performance is actually getting good now.

      I've only been waiting for Intel to ship a compelling iGPU since, I dunno, their "Extreme Graphics" in 2001? What on earth have their iGPU teams been doing over there for the last 20+ years?

      I guess the OEMs were blinkered enough not to demand it, and Intel management was blinkered enough not to see the upside on their own.

      • windowsrookie 3 days ago

        The Intel Iris Pro graphics from about 10 years ago were actually OK. I believe they matched the lower-end dedicated laptop GPUs of that era. The problem was that Apple was the only company willing to pay for the Iris Pro chips.

      • DaoVeles 3 days ago

        I think what they have been doing is focusing on what 95% of people use these things for: just basic utility. The most complex thing most people will render is Google Earth. I would not be surprised if that is the kind of workload Intel focuses its iGPU performance metrics on.

      • deelowe 4 days ago

        Intel didn't take gaming seriously until very recently. They stayed focused on productivity-oriented applications well past the time when netbooks became viable for most use cases.

    • adastra22 4 days ago

      Intel's absolute best integrated GPU being roughly comparable to a lower-end model from the competition is not "getting good."

      • [removed] 3 days ago
        [deleted]
    • bangaladore 4 days ago

      The "Intel Core Ultra 7 258V" is at least 2-3x slower than the GPU within the PlayStation 5. It is not even close, and that's last gen. Again, the APUs within modern consoles compete with desktop grade GPUs. In the case of the PS5 its roughly comparable to an RTX 2070 or Rx 6700 (better analog).

      • aurareturn 3 days ago

        GPUs can be scaled up with more cores and higher-bandwidth memory. I assume that, had Intel won the contract, they would have done so.

      • wmf 4 days ago

        Multiple commenters here are forgetting about discrete Battlemage.

        • berbec 4 days ago

          And that's telling, isn't it? Even in this space, Intel's iGPUs are totally ignored or dismissed out of hand. I say it's because they have an unending string of broken promises, saying "this'll be the time we get integrated graphics right" over and over. It's never been true, and I for one have totally wiped them from my vision because of that.

  • pknomad 4 days ago

    Ditto. AMD also reliably delivered on CPUs for the past 2 iterations of both Xbox and PS. AMD feels like the only choice for consoles at this point.

    • coder543 4 days ago

      Well, Nvidia has powered a much more popular console... the Nintendo Switch, and Nvidia looks set to power the Switch 2 when it launches next year. So, AMD is clearly not the only choice.

      • mdasen 4 days ago

        The problem with choosing Nvidia is that they can't make an x86 processor with an integrated GPU. If you're looking to maintain backward compatibility with the Playstation 5, you're probably going to want to stick with an x86 chip. AMD has the rights to make x86 chips and it has the graphics chips to integrate.

        Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

        AMD can simply repackage some Zen X cores with RDNA X GPU and with a little work have something Sony can use. Nvidia would need to either grab off-the-shelf ARM Cortex cores (like most of their ARM CPUs use) or Sony would need to bet that Nvidia could and would give them leading-edge performance on custom designed cores. But would Nvidia come in at a price that Sony would pay? Probably not. AMD's costs are probably a lot lower since they're going to be doing all that CPU work anyway for the rest of their business.

        For Nintendo, the calculus is a bit different. Nintendo is fine with off-the-shelf cores that are less powerful than smartphones and they're already on ARM so there's no backward incompatibility there. But for Sony whose business is different, it'd be a huge gamble.

      • pinewurst 4 days ago

        That's not an apples-to-apples comparison. The Switch is lower-price, lower-performance by design, and used, even originally, a mature Nvidia SoC, not really a custom part.

      • dathinab 3 days ago

        > much more popular console

        which isn't a useful metric, because "having a good GPU" wasn't at all why the Switch became successful; you could say it succeeded even though it had a pretty bad GPU. Bad only in the performance sense, though: as far as I can tell, back then AMD wasn't competitive on energy usage, and maybe not on price either, as the Nvidia chips were a byproduct of Nvidia trying to enter the media/TV add-on/handheld market with products like the Nvidia Shield.

        But yes, AMD isn't the only choice. IMHO, contrary to what many people seem to think, Intel is a viable choice too for the price segment most consoles target. But we're missing the relevant insider information to judge that properly.

      • qwytw 3 days ago

        > the Nintendo Switch, and Nvidia looks set to power the Switch 2

        Which runs a very old mobile chip that was already outdated when the Switch came out. Unless Nintendo is planning to go with something high-end this time (e.g. to compete with the Steam Deck and other more powerful handhelds), whatever they get from Nvidia will probably be more or less equivalent to a mid-tier off-the-shelf Qualcomm SoC.

        It's interesting that Nvidia is going along with that; it will just depress their margins. I guess they want to re-enter the mobile CPU market and need something to show off.

  • [removed] 4 days ago
    [deleted]
  • dathinab 3 days ago

    > Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

    I wouldn't exactly agree with that. Arc GPUs aren't really bad. Sure, when they were new there were driver issues for quite some time, but those have mostly been ironed out, and they were more in "expected issues with a first non-iGPU" territory than "Intel being very bad at their job" territory.

    Also, GPUs in consoles (ignoring the Switch) sit in the lower mid-range today, and that's unlikely to change with future consoles, so that is a segment Intel should be able to compete in. Console GPUs are more like big iGPUs than dedicated GPUs.

    The main issue is that, whether it's Intel, Nvidia or AMD, their drivers have subtle but sometimes quite important differences in performance characteristics, meaning optimizations for one are sometimes de-optimizations for the other, plus similar interoperability issues. And those seem more likely with Intel, as there is just much less history between the larger game engines and Arc GPUs.

    So IMHO Intel would have had to offer a lower price to be viable, to compensate for more backward-compatibility issues. If they were in a much better financial situation at the moment, I believe they would have had a good chance of winning by subsidizing the deal a bit to gain a foothold in the market, and then competing without that handicap next generation.

  • [removed] 4 days ago
    [deleted]
  • anemic 3 days ago

    Maybe the deal went south because Intel wanted it to be called Playstation 6 with Intel Integrated Graphics.

    And with a sticker on the front, of course.

johnklos 4 days ago

And they rightly deserve to lose the business to AMD.

Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."

Intel to Sony: "We're too big to commit to pricing, compatibility and volume." Sony: "Ok. We'll keep using AMD."

It's interesting that Intel keeps trying to ship "features", some of arguable utility but others decently helpful, like AVX-512, which AMD now delivers and Intel does not. I'm sure Sony didn't want a processor that can't properly and performantly run older and current titles.

  • tester756 4 days ago

    >Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."

    Reality:

    “We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do… At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”

    • fhdsgbbcaA 4 days ago

      This is from the horse's mouth, and reliable as such. However, it does give the impression that they weren't sufficiently interested to think more creatively about cost optimization, because they were riding the gravy train of Wintel ruling the world. So I think the root comment isn't too far off.

      • MBCook 3 days ago

        Right. It’s an accurate quote but that doesn’t mean it’s an accurate analysis.

        Not only did they not seem to understand the possibilities in front of them, their chips were not well positioned at all to win. They were too hot and too power-hungry because Intel didn’t care much about efficiency at the time.

        They were taking the "shrink a big chip" path. Apple, using ARM designs from Samsung and then their own, ended up taking the "grow a little chip" path.

        Which is a little ironic, because Intel made its fortune on the "little" desktop processor that grew up to take over the server market from mainframes and "big boy" server chips like SPARC and Alpha.

        They became the big boys and history started repeating.

      • silvestrov 3 days ago

        > no one knew what the iPhone would do

        When you are the CEO of Intel, you should be able to see and forecast what smartphones would do to the market.

        The iPhone wasn't completely new. Nokia already had some "little smart" phones on the market.

        The only real surprise was Apple's ability to get a US phone company on board with selling the iPhone while giving up its grip on what software was installed on the phones.

        • polar 3 days ago

          > Nokia already had some "little smart" phones on the market already.

          So did other hardware/software vendors, and many of their phones were a lot smarter than the iPhone.

    • windowsrookie 3 days ago

      Intel made ARM chips, then sold that portion of the company in 2006, shortly before the iPhone was announced.

      https://en.wikipedia.org/wiki/XScale

      It was incredibly bad timing. If Intel had continued making ARM chips, they could be in an entirely different position today.

      • tester756 3 days ago

        >It was incredibly bad timing. If intel had continued making ARM chips they could be in an entirely different position today.

        How so?

        ARM (the ISA) doesn't imply performance characteristics, or any significant advantage over x86.

    • toast0 4 days ago

      IMO, more interesting than Intel passing on the iPhone is Intel ending Atom for phones right before Microsoft demoed Continuum for Windows 10 Mobile. That would have been a much different product on an x86 phone, IMHO. Maybe it would have been enough of an exciting feature that Microsoft wouldn't have botched the Windows 10 Mobile release.

    • jiqiren 3 days ago

      The key in this quote is: "in hindsight, the forecasted cost was wrong"

      100% intel screwup.

epolanski 4 days ago

Not sure the title has the right framing.

It's hard to compete with AMD, the only tech company offering both x86 and solid GPU technology to go with it.

On top of that you have backwards compatibility woes and the uncertainty around Intel being able to deliver on its foundry.

All in all, this win would've been great PR for Intel's foundry, but money-wise it was never going to be huge sums.

  • 0xcde4c3db 4 days ago

    Backward compatibility guarantees are a significant factor, I think. A lot of the QA process for console games is predicated on testing against a fixed set of hardware configurations, and various race conditions and other weirdness can start crawling out of the woodwork even with modest changes. This has been seen in many games running on emulators, on hacked console firmwares that allow overclocking (e.g. by running the CPU at the "native" clock speed in backward-compatibility mode), or with framerate-unlocking patches.

  • jeroenhd 3 days ago

    Intel's Arc GPUs are quite competent (especially with the highly necessary driver updates). If Battlemage fixed the hardware scheduling design flaw, Intel has a decent shot at competing with AMD.

    If AMD continues to lose ground on the desktop market and Intel continues to advance with Arc, there's a chance the PS6/Xbox Series 360 will run on Intel instead of AMD.

  • hypercube33 4 days ago

    AMD also has a track record with Sony and consoles in general, dating back to the GameCube, of delivering success. Maybe not the fastest thing, but one that works and is reliable. Nvidia, IBM and Intel don't exactly deliver on the full suite either.

  • ChocolateGod 4 days ago

    > both x86 and a solid GPU technology that comes with it

    If only Project Denver had kept its original goal

    • wmf 4 days ago

      Transmeta and Denver never had great performance. If you want an x86 CPU it's so much safer to go with AMD.

      • pinewurst 4 days ago

        Plus Denver was constrained in x86 compatibility by Intel patents.

whalesalad 4 days ago

Intel hasn't made a console CPU/GPU since... the original Xbox?

AMD has done: GameCube, Wii, Xbox 360 (GPU, not CPU), Xbox One, PS4, PS5...

mastazi 4 days ago

According to the writer everything in tech is AI. It bothers me and makes it difficult to take the article seriously.

> Similar to how big tech companies like Google and Amazon rely on outside vendors to help design and manufacture custom AI chips

> Having missed the first wave of the AI boom dominated by Nvidia and AMD, Intel reported a disastrous second quarter in August.

  • MBCook 4 days ago

    > It bothers me and makes it difficult to take the article seriously.

    But if you're in the chip game, AI is the big thing of the last 10 years. It's driven a huge chunk of new sales and demand for upgrades that they likely wouldn't have seen otherwise.

    Having missed out on AI in many ways (nVidia was perfectly positioned, AMD better than Intel) they need stuff to keep growing.

    Their current business is looking shakier than at any time in recent history. ARM is getting pretty realistic on the desktop. Apple proved it, and now Samsung and Qualcomm have parts for Windows users that perform well enough (compared to the failure of early ARM on Windows).

    They're behind on selling silicon for AI to businesses, and it's not clear consumers care enough to upgrade their PCs. And when consumers do upgrade, they have not only great options from AMD, which is doing better than ever, but also the ARM threat.

    They’re being squeezed on all sides. The PS6 wouldn’t make them dominant but it would have been a very steady and reliable revenue stream for years and a chance at parlaying that into additional business. “See what we did for Sony? We can do that for you.”

    The article seemed rather well done to me. I think you’re being too dismissive in this case.

    • mastazi 3 days ago

      > But if you’re in the chip game AI is the big thing of the last 10 years.

      IMHO, AMD having done well despite being woefully unprepared for the recent AI wave suggests that AI is not the only big thing

      (edit: grammar)

  • deelowe 4 days ago

    I mean, it's very likely next-gen consoles will feature AI hardware. The PS5 Pro is already touting it.

    • xcv123 4 days ago

      The GPU is "AI hardware", and current PS5 already has it.

      • j_maffe 3 days ago

        Custom architecture optimized for ML is a thing.

  • [removed] 3 days ago
    [deleted]
apexalpha 3 days ago

Title is a bit weird; AMD has been the supplier for PS4 and PS5 already and will continue to supply the PS6.

I guess Intel lost the bidding process but they never had the 'Playstation business' in the first place.

Nevertheless, an interesting read.

eigenform 4 days ago

I wonder if Sony having to adapt their DRM/platform security strategy into Intel-world would've introduced a lot of friction.

This kind of thing is probably part of the motivation behind Intel splitting out a "Partner Security Engine."

nottorp 3 days ago

> Intel and AMD were the final two contenders in the bidding process for the contract.

That's an interesting question. Will either Sony or MS break backwards compatibility by going away from x86 again in the future? Definitely not with the next console generation.

On the CPU side, MS does have good x86-on-ARM emulation from their brand-new Windows on ARM, so it's conceivable. Not sure how bad it would be on the GPU side.

  • ThatPlayer 3 days ago

    Games aren't running on GPUs directly; they're using APIs like DirectX for Xbox. As long as the GPU implements the APIs properly it should be fine. RISC-V Linux with a desktop PCI-E AMD GPU and Linux kernel drivers can run games already: https://youtu.be/qHLKB39xVkw, limited by the power of the RISC-V CPU here.

    I'm wondering if they would still aim for a single chip when moving to ARM. AMD and Intel don't make ARM chips. Nvidia does, and is probably what the next Switch will use. Qualcomm does have proper DX12 support on Windows on ARM, but who knows how that would scale, since they make mostly mobile GPUs. Intel had similar problems scaling their iGPUs up for Arc.

langsoul-com 4 days ago

Why did AMD win the console business? It seems that even though they weren't number 1, they were always in most consoles.

  • JonChesterfield 4 days ago

    I believe x64 won the PS4 era because game devs were deeply sick of targeting a special-purpose architecture for the console and also x86 for the PC port. At the time, all desktop gaming computers were x86-based IBM-clone things.

    I don't know why AMD ended up with the PS4 and the Xbox as opposed to Intel getting either, but x86 was probably inevitable. I wonder if these days something architecturally similar to the Mac ARM systems would be a reasonable alternative.

    • MBCook 3 days ago

      You're right, they didn't want to target weird things like the PS3 anymore; that was a huge pain.

      But you have to remember the other side: there were no other options than x86.

      ARM wasn't powerful enough for a high-end console at the time. Trying to have such a chip designed would have been a lot more expensive than choosing a premade design and tweaking it a little. It worked for Nintendo, but they had different goals. I'm not sure ARM could have been used for the PlayStation 4 or Xbox One.

      The previous supplier, IBM with PowerPC, had basically given up. Apple switched away from them for the same reason: it wasn't getting much faster, and IBM only really seemed interested in server chips. I think it's reasonable to assume they wouldn't have tried very hard to win the chance to make a faster console chip.

      If you don’t want to design your own, that’s all the major players. x86-64 is all that’s left.

      Which is not to say that was a bad option. Developers are extremely familiar with it, there’s a metric ton of tools available, it makes game porting to and from PCs easier, it was the highest performance option, and there are two big suppliers that you can play against each other. So even if you went with Intel and something happened you could switch to AMD.

      When IBM decided they didn’t care about the market, you just had to leave the PowerPC.

      Why AMD over Intel? They were probably hungrier since they were in second place. They had a competitive GPU business, which Intel didn’t. Single supplier + they could do everything on one chip. And if AMD makes both parts they can help optimize the hell out of it.

      Microsoft got screwed by nVidia on the XBox. I don’t think they’d want to do that again. Sony would absolutely know that happened and be wary.

      Honestly, it's not clear to me that Nvidia cared too much. But maybe I'm just reading it wrong. Nintendo went with them because they had an all-in-one system-on-a-chip that they were willing to dump for cheap and that, thanks to its mobile heritage, developers were already familiar with.

      Also, since AMD was the little guy (compared to Intel), they could really use the sales and the revenue. It would be a bigger percentage of their total income than it would be for Intel, meaning the contract was more important for them to win.

      So in the end I think it makes a lot of sense that PlayStation and Xbox ended up in AMD land.

    • philistine 3 days ago

      > [..] something architecturally similar to the mac arm systems would be a reasonable alternative.

      It's called the Nintendo Switch, and it's the second-best-selling console of all time.

      • MBCook 3 days ago

        It was also targeting a very different capability level from the PS4/XBox One.

    • someNameIG 3 days ago

      ARM would be a reasonable alternative. Unity/UE5 already support it due to mobile and the Nintendo Switch, and consoles are usually more power- and thermally-constrained than desktops, so ARM in many ways would be a better alternative than x64.

      Plus PlayStation is big enough that if they went ARM, game devs would have to follow.

      • MBCook 3 days ago

        But were there fast/powerful enough ARM chips to be competitive with what ended up in the PS4/XBox One?

        They certainly exist today. But could Sony and Microsoft have chosen them or would they have had to have them developed?

        • someNameIG 3 days ago

          I'm not sure. The Jaguar CPUs in them were pretty underpowered at the time too; they were tablet/netbook level. I think they were in some ways a bit of a downgrade in CPU performance compared to the Cell in the PS3.

          The biggest issue at the time would have been finding an ARM CPU with a decent GPU, if they wanted an SoC instead of separate dies.

  • netcoyote 3 days ago

    This is just a hypothesis, but I wonder if it's simply that AMD was willing to accept lower margins to keep their business going, whereas Intel wasn't willing to compete because they compared the contract to the higher-margin sales they make in the PC business?

    I mean, sure, technical issues and such too, but mature businesses have a hard time accepting lower margins because it hurts their stock market metrics.

jandrese 4 days ago

It's a little vague what the "6 chips" would have been. CPU obviously. Probably some southbridge equivalent, but then what? A NIC? Was Intel going to supply the graphics chip too? That would have been a real turnaround for their GPU division.

  • bhouston 4 days ago

    There weren't 6 chips; I believe it refers to the "PlayStation 6", the version that comes after the "PlayStation 5".

    I think the main thing Intel was competing for was CPU + GPU, given they have the new Xe graphics architecture, which is decent. But I guess they could have gone after just the CPU, with Nvidia then likely supplying the GPU as they did in the first Xbox.

    • jandrese 4 days ago

      Oh derp. You are right. I totally misread that.

    • deelowe 4 days ago

      I bet Intel's hubris is too strong to allow them to do that these days.

andrewstuart 4 days ago

Everything is about GPUs these days.

Be it little GPUs inside the CPU package or be it consumer GPUs or big GPUs in data centers.

Unless Intel can start to get its GPU act together, it won't be leading the industry again in a hurry.

  • MBCook 4 days ago

    They're getting better. But as the article mentions, backwards compatibility is a huge lock-in factor. It's way easier for AMD to achieve it than it would be for Intel (or anyone else) following the PS5.

criticalfault 4 days ago

If we are talking foundry, Intel could still manufacture AMD chips in their own fabs instead of TSMC. Given that 18A is good...

This would be strange, but it would show Intel will do what it takes.

  • bgnn 4 days ago

    They have to do that to stay in the fab business. Their design business is going to be sold off, I think, like HP/Agilent.

motbus3 3 days ago

Since Intel got fat government and cloud contracts, nothing else matters.

voytec 4 days ago

s/Playstation/PlayStation 6/ - next generation

[removed] 4 days ago
[deleted]
lapinovski 4 days ago

playstation 6 already?

  • wmf 4 days ago

    It will probably be released around 2027-2028.

    • MBCook 4 days ago

      And with the PS5 Pro announced we know the PS5 is basically dead inside Sony’s engineering organization.

      Other than a die shrink or chip reduction or something there’s not much else to do. The designs are basically done.

      So all hardware work would currently be focused on the PS6. At least for the home console line.

      • Narishma 3 days ago

        The PS5 is definitely not dead, it's their main console. The Pro is a niche machine targeted at wealthy enthusiasts. It will only do a fraction of the regular PS5 sales numbers.

  • [removed] 3 days ago
    [deleted]