Comment by Danieru 4 days ago

57 replies

No one appears to have mentioned the important meta game going on: Intel bidding as a credible alternative supplier.

For Intel, by bidding they get to undercut AMD's profits.

For Sony, they get a credible alternative which they can pretend would be a viable choice, thus forcing a slightly better deal out of AMD.

We saw similar articles around the Switch 2. That time it was AMD acting as spoiler to Nvidia; Nvidia reportedly got the contract, and that time too we got news articles lamenting the loss for AMD.

As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

Switching vendors does not just break compatibility with old games; it also requires retooling their internal libraries. Console games, outside small or open-source engines, use proprietary graphics APIs, and those APIs are tied to the hardware. With the coming generation from Nintendo, and the "current gen" from Sony and Xbox, they've been able to reuse most of their software investment. I'd say more but this is obviously under NDA; other devs should be able to confirm.

Thus I don't think AMD for the Switch 2 or Intel for the PS6 was ever a credible path. Their bids existed to keep the incumbent vendor from getting overly greedy and ruining the parade for everyone. This matters: famously, the original Xbox was hamstrung in the market by Nvidia's greed and refusal to lower prices as costs went down.

senkora 4 days ago

+1. An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console. If you suddenly changed the micro-architecture (if switching to Intel), then old games could suddenly become janky and slow even though both systems are x86.

(This would only matter if you were pushing performance to the edge, which is why it rarely matters for general software development, but console game dev pushes to the edge)

So it isn't just the graphics APIs that would change going from AMD to Intel, but the CPU performance as well.
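
As a toy illustration of that sensitivity (not from any console SDK): whether a full-precision divide or a reciprocal estimate plus one Newton-Raphson step is faster is a property of the core, because divider latency varies far more across microarchitectures than the multiply/subtract path does.

  #include <immintrin.h>

  // Two ways to compute 1/x for four packed floats. Which one wins
  // depends on the microarchitecture: divide latency/throughput vary
  // a lot from core to core (e.g. Jaguar vs. Zen 2), while the
  // estimate+refine path has a more stable cost at reduced precision.
  __m128 recip_exact(__m128 x) {
      return _mm_div_ps(_mm_set1_ps(1.0f), x);   // divider-bound
  }

  __m128 recip_fast(__m128 x) {
      __m128 r = _mm_rcp_ps(x);                  // ~12-bit estimate
      // one Newton-Raphson step: r' = r * (2 - x*r)
      return _mm_mul_ps(r, _mm_sub_ps(_mm_set1_ps(2.0f),
                                      _mm_mul_ps(x, r)));
  }

Code tuned around one core's timings for choices like this can land on the slower path after a microarchitecture switch, which is exactly the kind of regression described above.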

  • deaddodo 4 days ago

    > Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console.

    While that can be true, very few gamedev companies these days optimize to that degree. They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.

    Nobody is hand-tuning assembly these days to fit into tight instruction windows, at least not outside of some very specific logic fragments. Instead they're all writing generic interrupt-based logic, which is fine, as that's what newer CPUs expect and optimize for internally.

    In addition, the gap between Zen generations is about as large as a switch to Intel would be. We're talking fairly different cache coherency, memory hierarchies, CCX topologies, micro-op and instruction timings, iGPU configurations, etc.

    That all being said, AMD was going to beat Intel regardless, because of established business relationships and because Intel's current internal struggles (both business-wise and in R&D) make it fairly difficult for them to provide an equivalent alternative.

    • soganess 3 days ago

      Asking this as an open-ended (if leading) question: I assume enough people are optimizing to that degree, otherwise the PS5 Pro makes no sense... right?

      They (AMD/Sony) shoehorned the RDNA 3/3.5 GPU architecture onto an older Zen 2 core, on a different process node, because... they felt like making a frankenAPU? Especially since these APUs are usually monolithic (vs. chiplet) in design and share a memory controller. Surely it would have been easier/cheaper to put in 8 Zen 4c/5c cores and call it a day.

      I'm pretty sure I'm just missing something obvious...

      • wmf 3 days ago

        For PlayStation APUs, it's likely that AMD presents a menu of options and Sony chooses which components they want. For PS5 Pro, the CPU is unchanged from PS5 because Sony doesn't feel the need for anything faster. A newer CPU would take more area. But Sony really wanted better raytracing and AI so they chose RDNA 3.9 or whatever for the GPU. I suspect the cores are all mostly synthesized so they can support any process and Infinity Fabric is compatible enough that you can mix and match new and old cores.

      • deaddodo 2 days ago

        > They (AMD/Sony) shoehorned the RDNA 3/3.5 GPU architecture onto an older Zen 2 core

        The original core was already a custom configuration. I don't see why it seems odd that the new version would be a custom configuration based on the previous one.

        > with a different process node

        This doesn't apply to the PS5 SoC, but is general to AMD's methodology.

        AMD has been using an off-chip interposer setup for multiple generations now. They did this specifically to allow for different process nodes for different chips.

        It's cheaper (and there are more fab options) to produce chips on an older process node. If there's no reason to update the CPU, it makes sense to keep it on the cheaper option.

        In regards to the PS5 and Xbox SoCs specifically:

        The entirety of the SoC is fabbed at the same process node. A core designed for a 14nm process and then fabbed at 7nm (assuming drastic changes weren't needed to make it function at the smaller node) is going to be much smaller and run cooler. That is cheaper and leaves more room in the total footprint for the GPU-specific and auxiliary logic. The same rule as above applies: why use more if it's not needed?

        > they felt like making a frankenAPU

        All of the game console chips are "frankenAPUs".

        > Especially since the APUs are usually monolithic (vs chiplet) in design and share a memory controller.

        "Monolithic" vs "chiplet" is an arbitrary distinction, in this case. The individual logic cores are still independent and joined together with interposers and glue logic. This is clear from the die shots:

        https://videocardz.com/newz/sony-playstation-5-soc-die-pictu...

        To return to the previous point, look at the space dedicated to the CCXs. Zen 2 has ~1.9bn transistors, Zen 3 ~4.1bn, Zen 4 ~6.6bn, etc. Using a newer core would double or triple that space, increasing the total die size, which makes each chip more expensive and raises the defect rate.
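
        As a back-of-envelope on the defect-rate point, assuming the classic Poisson yield model with defect density D_0 and block area A:

          Y(A) = e^{-A D_0}, \qquad Y(3A) = e^{-3 A D_0} = Y(A)^3

        So a block that yields 90% defect-free at Zen 2 size would yield roughly 0.9^3 ≈ 73% at triple the area, before even counting the extra wafer cost per die.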

        > Surely it would have been easier/cheaper to put in 8 zen 4c/5c cores and call it a day.

        Definitely not.

        > I'm pretty sure I'm just missing something obvious...

        Nothing about chip design is obvious.

      • pjmlp 3 days ago

        PS5 Pro makes no sense, yes.

        Most studios aren't even able to push the current PS5 to its limits, given current development schedules and budgets.

        The PS5 Pro is for the same target audience as the PS4 Pro: hardcore console fans who will buy whatever the console vendor puts out. And Sony needs to improve their margins.

    • MichaelZuo 4 days ago

      How would you explain cross PS5/PC releases being much more efficient on the PS5?

      e.g. Horizon Forbidden West needing a much better GPU on PC to run at the same level of fidelity as the PS5.

      What explains that, if not special tuning for the PS5's particulars?

      (I can imagine Windows bloat and other junk requiring an additional 10% to 20%, but not 30% to 50%.)

      • jitl 4 days ago

        The comment above is elaborating on x86 micro-architecture, the differences between how the CPU handles x86 instructions specifically.

        The overall system architecture is different between a PC, which has discrete memory systems for the CPU and GPU and a very long pathway between GPU memory and system/CPU memory, and today's consoles, which have unified memory for CPU+GPU plus optimized pathways for loading from persistent storage.

        Consoles use their own graphics APIs, but you would have whichever vendor you contract with support your native graphics API, and everything would be "fine". PS5 games use GNM/GNMX, PlayStation's proprietary graphics APIs. PC ports of console-native games usually re-implement the rendering engine on top of PC graphics APIs like DirectX or Vulkan, and the re-implementation is probably less efficient and less tuned.
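
        A toy sketch of the two upload paths (plain memcpy stands in for driver-scheduled DMA; none of these names are a real API):

          #include <cstdint>
          #include <cstring>
          #include <vector>

          // Discrete PC GPU: vertex data takes two hops:
          // CPU -> host-visible staging buffer -> PCIe DMA -> VRAM.
          void upload_discrete(const std::vector<uint8_t>& verts,
                               uint8_t* staging,        // host-visible
                               uint8_t* vram_shadow) {  // stands in for VRAM
              std::memcpy(staging, verts.data(), verts.size());
              // Really an async DMA the driver schedules, plus a
              // barrier before the GPU is allowed to read it.
              std::memcpy(vram_shadow, staging, verts.size());
          }

          // Unified memory (console APU): one allocation is visible to
          // both CPU and GPU, so "upload" is a single write (plus the
          // cache flushes the platform API takes care of).
          void upload_unified(const std::vector<uint8_t>& verts,
                              uint8_t* shared_mem) {
              std::memcpy(shared_mem, verts.data(), verts.size());
          }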

      • yangff 4 days ago

        Horizon Forbidden West was ported from PS to PC. Decima is an engine from Sony's first-party studio, so it's understandable that their development process would lean more towards the PS's internal architecture rather than the more common GPUs on the market. Of course, even general-purpose engines can perform better on PS5, AMD, or NV. But those engines have less information about how customers will use them, so there's less information available to optimize with. On the other side, customers using these engines often don't have enough experience to optimize sufficiently for each platform. None of this is absolute, but I think the logic is reasonable.

        Game developers using these engines, if they take optimization seriously, typically adjust lighting, LOD, loading, and model details or shaders on console platforms to achieve a similar visual effect while hitting the targeted performance goals. This is why you usually get better performance on a console than on a PC at the same price point, aside from the subsidies provided by Sony.

      • tacticus 3 days ago

        > Horizon Forbidden West needing a much better GPU on PC to run at the same level of fidelity as the PS5.

        Not being expected to run with variable refresh rate/interleaving, and accepting 30/60 fps in best-case situations?

      • deaddodo 3 days ago

        > They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.

    • HelloNurse 3 days ago

      And, more simply, Moore's Law should ensure that in a next-generation console with a new microprocessor architecture, slowdowns in some instructions and memory access patterns are compensated by a general speedup, limiting performance regressions to terribly unfortunate cases (which should be unlikely, and obvious enough that they get mitigated).

  • mikepavone 4 days ago

    > An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

    What? The Jaguar-based CPU in the PS4 has both a much lower clock and substantially lower IPC than the Zen 2 based one in the PS5. The timings are not remotely the same and the micro-architectures are quite different. Jaguar was an evolution of the Bobcat core which was AMD's answer to the Intel Atom at the time (i.e. low cost and low-power, though it was at least an out-of-order core unlike contemporary Atoms).

    Going from GCN to RDNA on the GPU side is also a pretty significant architectural change, though definitely much less than going from AMD to Intel would be.

    • senkora 4 days ago

      I did some more research and I was wrong.

      My source was an AMD tech talk from years ago where they mentioned keeping instruction timings the same for backwards compatibility reasons.

      I believe they were talking about this for the XBox One X: https://en.wikichip.org/wiki/microsoft/scorpio_engine#Overvi... (and a similar chip for the PS4 Pro)

      So basically, they upgraded and lightly enhanced the Jaguar architecture and shrunk the process (28nm -> 16nm), but otherwise kept it the same. AMD's Zen was released around this time and was far superior, but they decided to stick with Jaguar in order to make sure that instruction timings were kept the same.

      I guess that they didn't want two hardware revisions of the same console generation running on different micro-architectures, but they were okay switching the micro-architecture for the next console generation.

  • jheriko 4 days ago

    you clearly haven't played a modern game :P

    developers taking care over cpu timings is 10-15 years out of date. most of them these days don't even know what a dot product is, or how to find the distance to a point or to a straight line between two of them... and the people they rely on to do this for them make horrendous meals of it.

    but yeah, sure, cpu instruction timings matter.
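
    for the record, the math in question is only a few lines:

      #include <cmath>

      struct Vec2 { float x, y; };

      float dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }

      // distance from p to the infinite line through a and b:
      // |cross(ab, ap)| is the area of the parallelogram they span,
      // and dividing by |ab| leaves the height, i.e. the distance.
      float dist_point_line(Vec2 p, Vec2 a, Vec2 b) {
          Vec2 ab{b.x - a.x, b.y - a.y};
          Vec2 ap{p.x - a.x, p.y - a.y};
          float cross = ab.x * ap.y - ab.y * ap.x;
          return std::fabs(cross) / std::sqrt(dot(ab, ab));
      }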

    • DaoVeles 4 days ago

      I was about to say. I bailed out of the industry just as the Xbox One/PS4 was coming in. Even with the 360/PS3, it was considered wise to steer clear of that kind of low-level stuff just for one's sanity. When the X1/PS4 came in, it was completely abandoned; it turns out x86 compilers combined with OoO execution made that kind of tinkering not only nearly pointless but sometimes actively harmful to performance.

      Nowadays, I suspect it is almost entirely in the hands of the compilers, the APIs, and the base OS to figure out the gritty details.

      • xgkickt 3 days ago

        There are still manual optimizations that can be done (non-temporal writes where appropriate for example), but nothing like the painstaking removal of Load-Hit-Stores and cache control of the 360/PS3 era.
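
        For example, a non-temporal fill with SSE intrinsics (a sketch; it assumes a 16-byte-aligned destination):

          #include <immintrin.h>
          #include <cstddef>

          // Fill a large buffer with streaming stores so the write
          // doesn't evict the working set from cache. Only worth it
          // for data the CPU won't be reading back soon.
          void fill_nt(float* dst, std::size_t n, float v) {
              __m128 val = _mm_set1_ps(v);
              std::size_t i = 0;
              for (; i + 4 <= n; i += 4)
                  _mm_stream_ps(dst + i, val);  // bypasses the cache
              for (; i < n; ++i)
                  dst[i] = v;                   // scalar tail
              _mm_sfence();  // order NT stores before later accesses
          }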

    • Meganet 3 days ago

      The new chip will be faster in the ways that matter. I would bet that bandwidth between certain components is a lot more critical, or NUMA effects, or bandwidth between cores.

      I'm surprised that CPU instruction latency gets mentioned before those other factors.

  • lxgr 4 days ago

    Given the size of such a contract, wouldn't it be reasonable for Sony to just require instruction latencies equal to or better than the old CPU's for everything relevant?
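
    Such a clause would even be mechanically checkable. A rough sketch of a latency test (a serially dependent chain, so each iteration pays the full latency; a real conformance suite would pin the core, fix the clock, and sweep the whole ISA):

      #include <chrono>
      #include <cstdint>
      #include <cstdio>

      int main() {
          volatile uint64_t seed = 1;  // defeats constant folding
          uint64_t x = seed;
          const int N = 100000000;
          auto t0 = std::chrono::steady_clock::now();
          for (int i = 0; i < N; ++i)  // each step needs the last result
              x = x * 6364136223846793005ULL + 1442695040888963407ULL;
          auto t1 = std::chrono::steady_clock::now();
          double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
          std::printf("%.2f ns per dependent mul+add (%llu)\n",
                      ns / N, (unsigned long long)x);
      }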

ksec 4 days ago

Adding a bit more context.

Nvidia got the Switch contract when they were basically dumping those unwanted Tegra chips on Nintendo for an incredibly low price.

Xbox and PlayStation don't earn AMD much profit at all. AMD's custom-processor segment barely kept the company surviving; people may forget AMD was worth only ~$3B in market cap in 2016. They are now at ~$250B.

On the subject of software compatibility, though, one thing I got wrong was my prediction that having AAA titles on Xbox and PS would help AMD's market share on PC, given those titles are already optimised for Xbox and PS anyway. That didn't happen at all, and Nvidia continues to dominate.

  • elzbardico 4 days ago

    Sometimes a low-margin business is all you need and have: it keeps the lights on, stops you hemorrhaging too many people, and keeps you afloat until you get better winds.

    Traditional MBA thinking is sometimes too short-sighted. For example, PCs might not have been a cash cow for IBM, but the ThinkPad brand, the distributor relationships, and the customers may have helped IBM more than the cash from selling that business to Lenovo. Maybe keeping a healthy bridgehead with a popular brand of laptops could have helped IBM come up with some innovative way of selling the overhyped Watson.

    The same goes for AMD and video games: it paid the bills, paid salaries, and left a little profit on the table to be invested. It probably helped them bridge from their hell days to what they are today.

    There are a lot of intangibles, hidden symmetries, and serendipitous opportunities that are frequently overlooked by our bean-counting master race overlords.

  • sangnoir 4 days ago

    > Xbox and Playstation dont earn AMD much profits at all

    It doesn't cost them much either. Lisa Su, in an interview posted to HN a few months ago, said it is a deliberate strategy to repackage IP AMD has already developed: they are willing to pull designs from the shelf and customize them to meet partners' needs. Having a long tail of products adds up, and it sets you up to get first dibs on higher-margin partnerships in the future.

  • derstander 3 days ago

    > Nvidia got the bid for Switch when they were basically dumping those unwanted Tegra to Nintendo for an incredibly low price.

    This seems pretty well aligned with Gunpei Yokoi’s strategy of “Lateral Thinking [with] Withered Technology”. It worked out pretty well for Nintendo in the past (e.g., Gameboy) and seems to be working out with the Switch. Even though he has passed, his Wikipedia page alleges that this philosophy has been passed on to others at Nintendo.

    • lynguist 3 days ago

      > Withered technology

      At the time of its release, the Nintendo Switch's CPU was only a single generation behind ARM's latest offering, and its GPU was by far the most powerful mobile GPU available. "Withered technology" doesn't hold true for the Switch.

      What happened is that mobile compute has advanced tremendously since 2017 and Switch is stuck on technology that was leading in early 2017.

      • pjmlp 2 days ago

        It still provides marvelous gaming experiences; faster polygons don't equate to better games. That is especially an issue on the latest-gen PlayStation and Xbox, where many games with great graphics have a lousy gameplay experience at high prices.

  • DaoVeles 4 days ago

    A few of the Playstation titles that made their way to PC do seem to have a little home field advantage on AMD chips, but not enough to sway people over to them.

  • lupusreal 3 days ago

    > having AAA titles on Xbox and PS would have helped AMD's market share on PC, given those titles are already optimised on Xbox and PS anyway. That didn't happen at all. And Nvidia continue to dominate.

    My impression is that console ports have insufficient popularity with PC gamers for them to alter their hardware purchasing habits for those games.

lxgr 4 days ago

> Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

But would they really?

Staying on x86-64 would take care of CPU compatibility (unless there's some exotic AMD-only instruction set extension heavily used by PS4/5 games), and a GPU emulation stack seems at least somewhat plausible.

Sony has pulled this off multiple times before with seemingly completely incompatible architectures:

The PS2 came with the PS1 CPU (repurposed as an IO controller, but fully available for previous-gen games) and emulated the GPU on its own. The PS3 did the reverse in its second iteration (i.e. it included the PS2's GPU and emulated the CPU). The PS Vita's SoC had the PSP MIPS CPU included on-die, which in turn is similar enough to the PS1's to allow running those games too.

  • DSMan195276 3 days ago

    For GPU emulation, I'm not super knowledgeable, but I would think the shaders are a big issue; older systems didn't have that problem. Console games ship with precompiled shaders, and you won't be able to reuse those between AMD and Nvidia. Certainly you can get around it (emulators for modern consoles do just that), but it's not without its issues, which might be considered unacceptable.

    That's still fixable if you're willing to ship newly compiled shaders, but it's a lot more work if you're talking about some kind of per-game fix that needs to be downloaded. This is how Xbox 360 backwards compatibility works on newer Xboxes, and that approach means it only works with a subset of Xbox 360 games, not all of them. It's much better than nothing, but it's not a hardware-level fix that makes the original game binaries "just work".

    As for packaging the old GPU with the new system, I don't think that's realistic anymore, since prices for old chips simply don't drop enough and the system design would be a mess (the chips are huge and you'd need cooling for both; I guess if only one runs at a time it's not as bad, but...). Separately, if you're swapping from Nvidia to AMD, you're asking one of them to make a batch of old chips for you while you use their competitor's chip as the main one; they might not be willing to do it.

    • lxgr 3 days ago

      Would it not be possible to recompile all shaders at startup (or "install", i.e. first launch) time and then cache them (if runtime recompilation is even too slow in the first place)?
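
      Something along those lines is what a cache would look like; in this sketch, translate_shader is a hypothetical stand-in for the actual console-to-native shader recompiler, which is the genuinely hard part:

        #include <filesystem>
        #include <fstream>
        #include <functional>
        #include <iterator>
        #include <string>
        #include <string_view>
        #include <vector>

        namespace fs = std::filesystem;

        // Stub: a real version would recompile console GPU bytecode
        // into the host GPU's format here.
        std::vector<char> translate_shader(const std::vector<char>& blob) {
            return blob;
        }

        // Translate each shipped shader once, keyed by a hash of its
        // bytes, and reuse the cached result on later launches.
        std::vector<char> load_shader(const std::vector<char>& blob,
                                      const fs::path& cache_dir) {
            auto key = std::hash<std::string_view>{}(
                std::string_view(blob.data(), blob.size()));
            fs::path f = cache_dir / (std::to_string(key) + ".bin");
            if (fs::exists(f)) {  // cache hit: skip recompilation
                std::ifstream in(f, std::ios::binary);
                return {std::istreambuf_iterator<char>(in), {}};
            }
            auto native = translate_shader(blob);  // slow path, done once
            std::ofstream(f, std::ios::binary)
                .write(native.data(), native.size());
            return native;
        }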

jm4 4 days ago

The whole article seems unfair to Intel. They didn’t lose the contract because they didn’t have it in the first place. I think your analysis is correct. They win a little if they don’t get the contract and they win a lot if they do. It was a no brainer to bid on it.

neighbour 4 days ago

This is all true. Xbox always threatens to leave its current vendors, only to end up signing a renewal in the final hours of the contract.

>As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

In your view, is this issue worse with modern consoles now that PlayStation (and possibly Nintendo) online store purchases persist across generations? Imagine someone who has a PS4 and a PS5 and buys many games through the PlayStation Store, and then Sony selects a different chip supplier for the PS6. I'm guessing this would cause issues with games designed for the older consoles, breaking backwards compatibility.

I'd imagine that if the console manufacturers cared about backwards compatibility, which I think they do, the likelihood of them switching chip providers would decrease with each generation.

  • wmf 4 days ago

    Microsoft maintained backwards compatibility across Intel+Nvidia, IBM+ATI, and AMD+AMD so it's possible. Sony hasn't invested as much in compatibility, instead just keeping the same architecture for PS4/5.

    • lxgr 3 days ago

      Sony has historically invested a lot into backwards compatibility, going as far as shipping the previous gen's GPU and/or CPU with the PS2, initial PS3 models, and the PS Vita.

      PS3 compatibility on the PS4 was notably absent, though.

      • nottorp 3 days ago

        Historically. But not presently.

        They could at least include a software emulator for the PS2 on the PS5 (not the PS1, because AFAIK the drive in the PS5 does not read CDs) and let people use their old discs. But they don't; instead they re-sell old games packaged with the emulator in their online store.

    • neighbour 4 days ago

      True, but if you're referring to the fact that you can play Xbox and Xbox 360 games on newer hardware, I believe Microsoft has a team that has to individually patch these games to work on newer hardware.

      Sony does something similar, I believe, with their new Classics Catalogue as part of their most premium PS Plus tier.

      • jamesfinlayson 4 days ago

        Yeah, I remember the Xbox 360 being hit and miss with backwards compatibility; their FAQs said that most of the time the people working on it just had to look at the raw assembly of the games they were trying to get running to figure out what went wrong.

    • etempleton 3 days ago

      Most games were not backwards compatible between the Xbox and Xbox 360. They had to do work to make each game run, and they prioritized the most popular ones, most notably Halo. With that said, certain features still did not work properly: there was a Halo 2 map they took out of the online pool because it used a heavy fog effect that would not render on the 360.

      From the 360 to the Xbox One there was a similar situation where they would patch individual games to work, but because it was at least partially emulated, publishers had to sign off on allowing their games to be backwards compatible.

  • lxgr 3 days ago

    There was no backwards compatibility between the PS3 and PS4 whatsoever (except for PS Plus allowing cloud-based game streaming of some PS3 titles), and Sony survived that as well.

    What they did do was offer some old PS2 games for purchase, which allowed them to tap into that very large back catalog. I could see something like this happening for a hypothetical Intel PS6 as well (i.e. skipping PS5 backwards compatibility and tapping into the large catalog of PS4 and PS4/PS5 games).

aurareturn 3 days ago

I'm pretty sure the PS5 runs x86 and Vulkan. Both are standardized. That's why PS5 games can be easily ported to PCs running Intel and Nvidia.

So I’m not buying that going Intel would lose backwards compatibility.

  • pjmlp 3 days ago

    I am quite sure the PS5 doesn't do Vulkan at all, and you don't even need NDA access to know that; there are enough GDC talks and SCEE presentations on what APIs PlayStations do support.

  • mastax 3 days ago

    It's not clear to me that the PS5 supports Vulkan at all (excluding third-party translation layers); I would be happy to see any evidence. In any case, I'm confident the large majority of PS5 games use its native API, GNM.

    GNM could certainly be implemented for Intel GPUs, but it’s an additional cost to account for.

johnnyanmac 4 days ago

Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

I suppose it can be seen as controlling rampant greed (especially Nvidia's), but it feels like the consoles dealt the cards here. It would have taken either some revolutionary tech or an outright schism to make a business steer an otherwise smooth ship that way.

>As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

I agree that both are probably playing it safe this time. But as devil's advocate: both Sony and Nintendo are no strangers to ditching the previous gen if they don't want to compromise the next one. At this point Nintendo skews towards ditching (SNES/N64/GameCube/Switch vs. Wii/Wii U).

Sony tried it and almost failed hard with the PS3 (which kept backwards compatibility at first, through the whole SKU debacle, and then ditched it), but is otherwise usually consistent on BC. Well, that and the Vita. But I don't think anyone missed the UMD (those games were still backwards compatible digitally, though).

  • philistine 4 days ago

    > At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).

    Ultimately, a company is its people. And the management class at Nintendo is famously new. Everybody is expecting them to focus on robust backwards compatibility as part of their new, exciting development.

  • Tuna-Fish 3 days ago

    > Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

    I think there will be enough time between now and the PS6's release that they will be able to support full real-time ray tracing.

  • ac29 4 days ago

    > Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

    Why would they need to do a lot more work on compatibility if they'd picked Intel vs AMD?

    Either CPU is presumably going to be x86_64. The GPU is almost certainly going to be much different between AMD's PS5 GPU and AMD's PS6 GPU, so the graphics driver will need work either way.

    • yangff 3 days ago

      They could have AMD provide a compatibility layer for the GPU (although this might be a bad idea), but implementing an AMD compatibility layer on Intel/NV clearly seems like an even worse idea. But at least you might be able to run the already compiled shaders in compatibility mode?

Meganet 3 days ago

Those IP blocks are bought from specialist companies, right?

I would assume that if Intel can make ARM and x86, it can make whatever Sony needs.

Or is AMD's architecture THAT special? My assumption is that the PS3's streaming processor was different enough that it would have mattered, but with the PS4 and PS5?

You could also patch PS5 games if you needed to. The ecosystem is closed.

zelon88 3 days ago

> Switching vendors does not just invalidate old games compatibility, it also requires retooling for their internal libraries.

This is a red herring. The hardware is x86-64, all the game engines are made on x86-64, and all the games are compiled on, you guessed it, x86-64. That's why they stopped using PowerPC, Motorola, and other non-x86 architectures: to simplify backwards compatibility and actually get comparable value out of a decently performing system.

So when they tell you there is a cost overhead associated with switching vendors, that is BS. However long it takes to port your desktop driver package is how long it would take to get all of this working on different hardware.

Seriously, if someone in a basement in Arkansas can get Windows to run on a PowerPC PS3, Sony can figure out how to make x86-64 AMD games work on an x86-64 Intel chip. Anyone saying otherwise has an incentive not to make it happen.

smcl 3 days ago

I'm not convinced; this feels like those "actually this is good for Bitcoin" replies that are popular with cryptobros any time some bad news hits. Intel has lost out on a big, high-profile contract. This cannot be something they are happy with, and any explanation to the contrary is, as the kids say, "cope".