Comment by bangaladore 4 days ago

62 replies

Maybe I'm misinformed, but I could never see Intel getting this contract.

AMD has extensive experience with high-performing APUs, something Intel, at least in my memory, does not have. The chips in modern high-end consoles are supposed to compete with discrete desktop GPUs, not with integrated graphics. Does Intel even have any offerings that would indicate they could accomplish this? Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

janice1999 4 days ago

Intel has Battlemage [1]. Presumably that would be the basis of the console APU. Their iGPU performance is actually getting good now. [2]

[1] https://www.pcgamer.com/hardware/graphics-cards/embargo-no-p...

[2] https://www.tomshardware.com/pc-components/cpus/lunar-lake-i...

  • Scramblejams 4 days ago

    > Their iGPU performance is actually getting good now.

    I've only been waiting for Intel to ship a compelling iGPU since, I dunno, their "Extreme Graphics" in 2001? What on earth have their iGPU teams been doing over there for the last 20+ years?

    I guess the OEMs were blinkered enough not to demand it, and Intel management was blinkered enough not to see the upside on their own.

    • windowsrookie 4 days ago

      The Intel Iris Pro graphics from about 10 years ago were actually OK. I believe they were matching the lower-end dedicated laptop GPUs of that era. The problem was that Apple was the only company willing to pay for the Iris Pro chips.

      • kcb 3 days ago

        The other problem is Intel's graphics drivers for 3d gaming are a distant 3rd place. Games just haven't historically targeted their GPUs. We've had like 2 decades of games that for the most part have tested compatibility with Nvidia and AMD.

      • onli 3 days ago

        Those even made it to the desktop, for a very short while, with the Broadwell processors i5-5675C and i7-5775C. They were stronger than the FM2+ APUs AMD had released earlier, which Intel otherwise couldn't beat for years, and only weaker than the Ryzen APUs that followed.

        Of course, they were gone in the next generation. But had they been widely available, they might have changed things.

    • DaoVeles 4 days ago

      I think what they have been doing is focusing on what 95% of people use these things for: just basic utility. The most complex thing most people will render is Google Earth. I would not be surprised if that is the kind of workload Intel targets its iGPU performance metrics at.

    • deelowe 4 days ago

      Intel didn't take gaming seriously until very recently. They stayed focused on productivity applications well past the time when netbooks became viable for most use cases.

  • adastra22 4 days ago

    Intel’s absolute best integrated GPU being roughly comparable to a lower end model from the competition is not “getting good.”

  • bangaladore 4 days ago

    The "Intel Core Ultra 7 258V" is at least 2-3x slower than the GPU within the PlayStation 5. It is not even close, and that's last gen. Again, the APUs within modern consoles compete with desktop grade GPUs. In the case of the PS5 its roughly comparable to an RTX 2070 or Rx 6700 (better analog).

    • aurareturn 3 days ago

      GPUs can be scaled with more cores and higher-bandwidth memory. I assume that, had Intel won the contract, they would have done so.

    • wmf 4 days ago

      Multiple commenters here are forgetting about discrete Battlemage.

      • berbec 4 days ago

        And that's telling, isn't it? Even in this space, Intel's iGPUs are totally ignored or dismissed out of hand. I say it's because of their unending string of broken promises, saying "this'll be the time we get integrated graphics right" over and over. It's never been true, and I for one have written them off because of it.

pknomad 4 days ago

Ditto. AMD has also reliably delivered on CPUs for the past two iterations of both Xbox and PlayStation. AMD feels like the only choice for consoles at this point.

  • coder543 4 days ago

    Well, Nvidia has powered a much more popular console... the Nintendo Switch, and Nvidia looks set to power the Switch 2 when it launches next year. So, AMD is clearly not the only choice.

    • mdasen 4 days ago

      The problem with choosing Nvidia is that they can't make an x86 processor with an integrated GPU. If you're looking to maintain backward compatibility with the PlayStation 5, you're probably going to want to stick with an x86 chip. AMD has the rights to make x86 chips, and it has the graphics chips to integrate.

      Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

      AMD can simply repackage some Zen X cores with an RDNA X GPU and, with a little work, have something Sony can use. Nvidia would need to either grab off-the-shelf ARM Cortex cores (like most of their ARM CPUs use) or Sony would need to bet that Nvidia could and would give them leading-edge performance on custom-designed cores. But would Nvidia come in at a price Sony would pay? Probably not. AMD's costs are probably a lot lower, since they're going to be doing all that CPU work anyway for the rest of their business.

      For Nintendo, the calculus is a bit different. Nintendo is fine with off-the-shelf cores that are less powerful than smartphones and they're already on ARM so there's no backward incompatibility there. But for Sony whose business is different, it'd be a huge gamble.

      • coder543 4 days ago

        I think changing from AMD GPUs to Nvidia GPUs by itself has a good chance of breaking backwards compatibility, given how low-level and custom Sony's GPU API apparently is, so the CPU core architecture would just be a secondary concern.

        I was not saying Sony should switch to Nvidia, just pointing out that it is objectively incorrect to say that AMD is the only option for consoles when the most popular console today does not rely on AMD.

        I also fully believe Intel could scale up an integrated Battlemage to meet Sony's needs, but is it worth the break in compatibility? Is it worth the added risk when Intel's 13th and 14th gen CPUs have had such publicly documented stability issues? I believe the answer to both questions is "probably not."

      • kmeisthax 4 days ago

        Emulating x86 would be an option - though given Sony's history, I doubt they'd consider it seriously.

        For context...

        - PS1 BC on PS2 was mostly hardware but they (AFAIK?) had to write some code to translate PS1 GPU commands to the PS2 GS. That's why you could forcibly enable bilinear filtering on PS1 games. Later on they got rid of the PS1 CPU / "IO processor" and replaced it with a PPC chip ("Deckard") running a MIPS emulator.

        - PS1 BC on PS3 was entirely software; though the Deckard PS2s make this not entirely unprecedented. Sony had already written POPS for PS1 downloads on PS2 BBN[0] and PSP's PS1 Classics, so they knew how to emulate a PS1.

        - PS2 BC on PS3 was a nightmare. Originally it was all hardware[1], but then they dropped the EE+GS combo chip and went to GPU emulation, then they dropped the PS2 CPU entirely and all backwards compatibility with it. Then they actually wrote a PS2 emulator anyway, which is part of the firmware, but only allowed to be used with PS2 Classics and not as BC. I guess they consider the purchase price of the games on the shop to also pay for the emulator?

        - No BC was attempted on PS4 at all, AFAIK. PS3 is a weird basket case of an architecture, but even PS1 and PS2 aren't supported for BC.

        At some point Sony gave up on software emulation and decided it's only worth it for retro re-releases where they can carefully control what games run on the emulator and, more importantly, charge you for each re-release. At least the PS4 versions will still play on a PS5... and PS6... right?

        [0] A Japan-only PS2 application that served as a replacement for the built-in OSD and let you connect to and download software demos, game trailers, and so on. Also has an e-mail client.

        [1] Or at least as "all hardware" as the Deckard PS2s are

      • alexjplant 4 days ago

        > Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

        Ignorant question - do they have to? The last time I was up on gaming hardware it seemed as though most workloads were GPU-bound and that having a higher-end GPU was more important than having a blazing fast CPU. GPUs have also grown much more flexible rendering pipelines as game engines have gotten much more sophisticated and, presumably, parallelized. Would it not make sense for Nvidia to crank out a cost-optimized design comprising their last-gen GPU architecture with 12 ARM cores on an affordable node size?

        The reason I ask is that I've been reading a lot about 90s console architectures recently. My understanding is that back then the CPU and specialized co-processors had to do a lot of heavy lifting on geometry calculations before telling the display hardware what to draw. In contrast, I think most contemporary GPU designs take care of all of the vertex calculations themselves and therefore free the CPU up a lot in this regard. If you have an entity-based game engine and are able to split that object graph into well-defined clusters, you can probably parallelize the simulation and scale horizontally decently well; a rough sketch of what I mean is below. Given these trends, I'd think a bunch of cheaper cores could work about as well as, and for less money than, a few higher-end ones.
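        To make that concrete, here's a toy C++ sketch (hypothetical names, not any real engine's API) of the kind of parallelization I mean: if entities can be grouped into clusters that don't interact within a frame, each cluster can be updated on its own core, which is how a pile of cheaper cores could keep pace with a few fast ones.

          // Toy sketch, not a real engine: update independent entity clusters in parallel.
          #include <thread>
          #include <vector>

          struct Entity {
              float x = 0.0f, vx = 1.0f;
              void update(float dt) { x += vx * dt; }  // trivial per-entity simulation step
          };

          // Entities are assumed not to interact across clusters within a single frame.
          using Cluster = std::vector<Entity>;

          void simulate_frame(std::vector<Cluster>& clusters, float dt) {
              std::vector<std::thread> workers;
              workers.reserve(clusters.size());
              for (auto& cluster : clusters) {
                  // One worker per cluster; a real engine would use a persistent job system
                  // rather than spawning threads every frame, but the scaling idea is the same.
                  workers.emplace_back([&cluster, dt] {
                      for (auto& e : cluster) e.update(dt);
                  });
              }
              for (auto& w : workers) w.join();
          }

          int main() {
              // e.g. 12 modest cores, 10k entities per cluster
              std::vector<Cluster> clusters(12, Cluster(10000));
              simulate_frame(clusters, 1.0f / 60.0f);
          }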

      • kcb 3 days ago

        Nvidia has very little desire to make the high-end, razor-thin-margin chips that consoles traditionally demand. This is what Jensen has said, and it makes sense when there are other areas the silicon can be directed to with much greater profit.

      • FileSorter 3 days ago

        > The problem with choosing Nvidia is that they can't make an x86 processor with an integrated GPU

        "Can't" and "not being allowed to" are two very different things.

    • pinewurst 4 days ago

      That's not an apples-to-apples comparison. The Switch is lower price and lower performance by design, and used, even originally, a mature NVIDIA SoC, not really a custom design.

    • dathinab 3 days ago

      > much more popular console

      which isn't a useful metric, because "having a good GPU" wasn't at all why the Switch became successful; you could say it succeeded even though it had a pretty weak GPU. Weak only in the performance sense: as far as I can tell, AMD wasn't competitive on power consumption back then, and maybe not on price either, since the Nvidia chips were a byproduct of Nvidia trying to enter the media/TV add-on/handheld market with products like the Nvidia Shield.

      But yes, AMD isn't the only choice. IMHO, contrary to what many people seem to think, Intel is a viable choice too for the price segment most consoles tend to target. But then, we're missing the relevant insider information to properly judge that.

    • qwytw 3 days ago

      > the Nintendo Switch, and Nvidia looks set to power the Switch 2

      Which runs a very old mobile chip that was already outdated when the Switch came out. Unless Nintendo is planning to go with something high-end this time (e.g. to compete with the Steam Deck and other more powerful handhelds), whatever they get from Nvidia will probably be more or less equivalent to a mid-tier off-the-shelf Qualcomm SoC.

      It's interesting that Nvidia is going along with that; it will just depress their margins. I guess they want to re-enter the mobile CPU market and need something to show off.

      • coder543 3 days ago

        We already have a good sense of what SoC Nintendo will likely be going with for the Switch 2.

        Being so dismissive of the Switch shows the disconnect between what most gamers care about, and what some tech enthusiasts think gamers care about.

        The Switch 1 used a crappy mobile chip, sure, but it was able to run tons of games that no other Tegra device could have dreamed of running, due to the power of having a stable target for optimization, with sufficiently powerful APIs available, and a huge target market. The Switch 1 can do 90% of what a Steam Deck can, while using a fraction of the power, thickness, and cooling. With the Switch 2 almost certainly gaining DLSS, I fully expect the Switch 2 to run circles around the Steam Deck, even without a “high end chip”. It will be weaker on paper, but that won’t matter.

        I say this as someone who owns a PS5, a Switch OLED, an ROG Ally, and a fairly decent gaming PC. I briefly had an original Steam Deck, but the screen was atrocious.

        Most people I see talking about Steam Deck’s awesomeness seem to either have very little experience with a Switch, or just have a lot of disdain for Nintendo. Yes, having access to Steam games is cool… but hauling around a massive device with short battery life is not cool to most gamers, and neither is spending forever tweaking settings just to get something that’s marginally better than the Switch 1 can do out of the box.

        The Switch 1 is at the end of its life right now, but Nintendo is certainly preparing the hardware for the next 6 to 8 years.

dathinab 3 days ago

> Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

I wouldn't exactly agree with that. ARC GPUs aren't really bad. Sure, when they were new there were driver issues for quite some time, but those have mostly been ironed out, and they were more in "expected issues with a first non-iGPU" territory than "Intel being very bad at their job" territory.

Also, the GPUs in consoles (ignoring the Switch) are in the lower mid-range today, and that's unlikely to change with future consoles, so that is a segment Intel should be able to compete in. Console GPUs are more like big iGPUs than dedicated GPUs.

The main issue is that, whether it's Intel, Nvidia, or AMD, their drivers have subtle but sometimes quite important differences in performance characteristics, meaning optimizations for one are sometimes de-optimizations for another, plus similar interoperability issues. And those seem more likely with Intel, as there is just much less history between the larger game engines and ARC GPUs.

So IMHO Intel would have had to offer a lower price to be viable, to compensate for more backward-compatibility issues. But if they were in a much better financial situation at the moment, I believe they would have had a good chance of getting it by subsidizing it a bit, gaining a foothold in the market so they could compete without that drawback next generation.

anemic 3 days ago

Maybe the deal went south because Intel wanted it to be called Playstation 6 with Intel Integrated Graphics.

And with a sticker on the front, of course.