AMD and Sony's PS6 chipset aims to rethink the current graphics pipeline (arstechnica.com)
322 points by zdw 3 days ago
The amount of drama about AI-based upscaling seems disproportionate. I know framing it in terms of AI and hallucinated pixels makes it sound unnatural, but graphics rendering has always been built on hacks and approximations.
Even without modern deep-learning-based "AI", it's not like the pixels you see in traditional rendering pipelines were all artisanal and curated.
AI upscaling is equivalent to lowering the bitrate of compressed video.
Given Netflix's popularity, most people obviously don't value image quality as much as other factors.
And that's true even for me. For gaming, given the choice of 30fps at a higher bitrate or 60fps at a lower one, I'll take the 60fps.
But I want high bitrate and high fps. I am certainly not going to celebrate the reduction in image quality.
> AI upscaling is equivalent to lowering the bitrate of compressed video.
When I was a kid, people had dozens of CDs with movies, while pretty much nobody had DVDs. DVDs were simply too expensive, while Xvid let you compress an entire movie onto a CD while keeping good quality. Of course the original DVD release would've been better, but we were too poor, and watching ten movies at 80% quality was better than watching one movie at 100% quality.
DLSS lets you effectively quadruple FPS with minimal subjective quality impact. Of course the natively rendered image would've been better, but most people are simply too poor to buy a gaming rig that plays the newest games at 4K 120 FPS on maximum settings. You can keep arguing as much as you want that the natively rendered image is better, but unless you send me money for a new PC, I'll keep using DLSS.
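To put numbers on "quadruple": DLSS's Performance mode renders internally at half the output resolution per axis, so the GPU shades a quarter of the pixels. A back-of-envelope sketch, assuming shading cost scales roughly with pixel count (a simplification, since some per-frame work is resolution-independent):

    // Performance mode: render at 1080p internally, upscale to a 4K output.
    constexpr long long native_px   = 3840LL * 2160;  // 8,294,400 (4K output)
    constexpr long long internal_px = 1920LL * 1080;  // 2,073,600 (1080p internal)
    static_assert(native_px / internal_px == 4, "a quarter of the shaded pixels");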
People have different sensitivities. For me personally, the reduction in image quality is very noticeable.
I am playing on a 55” TV at computer monitor distance, so the difference between a true 4K image and an upscaled one is very significant.
From what I gather, the contentious part is the overhead of hallucinating these pixels, on cards that also cost a lot more than the previous generation for otherwise minimal gains outside of DLSS.
Some [0] are seeing a 20 to 30% drop in actually rendered frames when activating DLSS, which means correspondingly more latency as well.
There are still games where it should be a decent tradeoff (racing or flight simulators? Infinity Nikki?), but it's definitely not a no-brainer.
Not sure I'd put much faith in the word of someone who uses Asmongold clips and seems to look for drama where there is none.
Asking gaming YouTube channels, especially those focused on FPS and performance, to be clinical and drama-free is a tall order.
There are a lot of theoretical arguments I could give you about how almost all cases where hardware BVH can be used, there are better and smarter algorithms to be using instead. Being proud of your hardware BVH implementation is kind of like being proud of your ultra-optimised hardware bubblesort implementation.
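For anyone who hasn't worked with one: a BVH (bounding volume hierarchy) is a tree of nested boxes over the scene's triangles, and "hardware BVH" means fixed-function units that walk that tree per ray. Here's a minimal CPU-side sketch, with simplified made-up types, of the branchy, data-dependent traversal in question:

    #include <algorithm>
    #include <stack>
    #include <utility>
    #include <vector>

    struct Ray  { float org[3], invDir[3]; };   // direction stored pre-inverted
    struct AABB { float lo[3], hi[3]; };
    struct Node {
        AABB box;
        int  left = -1, right = -1;             // both -1 means this is a leaf
        int  firstTri = 0, triCount = 0;        // triangle range for leaves
    };

    // Slab test: does the ray intersect the bounding box at all?
    bool hitBox(const AABB& b, const Ray& r) {
        float tmin = 0.f, tmax = 1e30f;
        for (int a = 0; a < 3; ++a) {
            float t0 = (b.lo[a] - r.org[a]) * r.invDir[a];
            float t1 = (b.hi[a] - r.org[a]) * r.invDir[a];
            if (t0 > t1) std::swap(t0, t1);
            tmin = std::max(tmin, t0);
            tmax = std::min(tmax, t1);
        }
        return tmin <= tmax;
    }

    // The loop hardware RT units accelerate: a stack-driven tree walk where
    // neighbouring rays take different paths, which wide SIMD GPUs dislike.
    void traverse(const std::vector<Node>& nodes, const Ray& r,
                  std::vector<int>& hitLeaves) {
        std::stack<int> todo;
        todo.push(0);                           // start at the root
        while (!todo.empty()) {
            const Node& n = nodes[todo.top()];
            todo.pop();
            if (!hitBox(n.box, r)) continue;    // prune this whole subtree
            if (n.left < 0 && n.right < 0) {
                hitLeaves.push_back(n.firstTri);
                continue;
            }
            if (n.left >= 0)  todo.push(n.left);
            if (n.right >= 0) todo.push(n.right);
        }
    }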
But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.
A common argument is that we don't have fast enough hardware yet, or that developers haven't been able to use raytracing to its fullest yet, but it's been a pretty long damn time since this hardware went mainstream.
I think the most damning evidence of this is the just-released Battlefield 6. This is a franchise that previously had raytracing as a top-level feature. This new release doesn't support it and doesn't intend to.
And in a world where basically every AAA release is panned for performance problems, BF6 has articles like this: https://www.pcgamer.com/hardware/battlefield-6-this-is-what-...
> But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.
Pretty much this. Even in games that have good ray tracing, I can't tell whether it's on or off (except for the FPS hit). I cared so little that I bought a card not known to be good at it (7900 XTX), because the two games I play the most don't support it anyway.
They oversold the technology/benefits and I wasn't buying it.
> Enabling raytracing in games tends to suck.
Because enabling raytracing means the game has to support non-raytraced rendering too, which limits how the game's design can take advantage of raytracing being realtime.
The only exception to this I've seen is The Finals: https://youtu.be/MxkRJ_7sg8Y . Made by ex-Battlefield devs; the dynamic environments they shipped two years ago are on a whole other level even compared to Battlefield 6.
There's also Metro: Exodus, which the developers re-made to only support RT lighting. Digital Foundry made a nice video on it: https://www.youtube.com/watch?v=NbpZCSf4_Yk
> But how about a practical argument instead.
With raytracing, lighting a scene goes from taking hours or days to just designating which objects emit light.
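A sketch of what "designating" can look like in a path-traced pipeline (the struct and fields are made up, but the idea is standard: any surface with nonzero emission is a light source):

    // Hypothetical material definition. No hand-placed fill lights, no rebake
    // when the level changes; the renderer discovers the bounce lighting.
    struct Material {
        float albedo[3];
        float emission[3];   // emitted radiance; {0,0,0} for ordinary surfaces
    };

    const Material concrete { {0.6f, 0.6f, 0.6f}, {0.f, 0.f, 0.f} };
    const Material neonSign { {0.0f, 0.0f, 0.0f}, {12.f, 2.f, 9.f} };  // a light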
naive q: could games detect when the user is "looking around" at breathtaking scenery and raytrace those? offer a button to "take picture" and let the user specify how long to raytrace? then for heavy action and motion, ditch the raytracing? even better, as the user passes through "scenic" areas, automatically take pictures in the background. Heck, this could be an upsell kind of like the RL pictures you get on the roller coaster... #donthate
(sorry if obvious / already done)
It will never be fast enough to work in real time without compromising some aspect of the player's experience.
Ray tracing is solving the light transport problem in the hardest way possible. Each additional bounce adds exponentially more computational complexity. The control flow also gets very branchy once you get into the wild indirect-lighting scenarios. GPUs prefer straight SIMD flows, not wild, hierarchical rabbit-hole exploration. Disney still uses CPU-based render farms. There's no way you are reasonably emulating that experience in <16 ms.
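To make "exponentially" concrete, here's a toy count for Whitted-style branching, where each hit spawns several secondary rays. Production path tracers sidestep the explosion by tracing one stochastic ray per bounce, at the price of noise, a denoiser, and exactly that divergent control flow:

    // Toy model: each bounce spawns `branch` secondary rays (reflection,
    // refraction, shadow rays, ...), so work grows like branch^depth.
    long long raysTraced(int depth, int branch) {
        if (depth == 0) return 1;
        long long total = 1;                    // count this ray itself
        for (int i = 0; i < branch; ++i)
            total += raysTraced(depth - 1, branch);
        return total;
    }
    // raysTraced(4, 8) == 4681 rays behind a single primary ray.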
The closest thing we have to functional ray tracing for gaming is light mapping. This is effectively just ray tracing done ahead of time, but the advantage is you can bake for hours to get insanely accurate light maps and then push 200+ fps on moderate hardware. It's almost like you are cheating the universe when this is done well.
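A sketch of that bake step; gatherIrradiance below is a stand-in for a real offline path tracer, which is where the hours go:

    #include <vector>

    struct Vec3  { float x = 0, y = 0, z = 0; };
    struct Texel { Vec3 worldPos, normal, irradiance; };

    // Stand-in for the expensive part: an offline gather of bounced light
    // around one surface point. A real baker traces thousands of rays here.
    Vec3 gatherIrradiance(const Vec3& pos, const Vec3& nrm, int samples) {
        (void)pos; (void)nrm; (void)samples;
        return {0.2f, 0.2f, 0.25f};             // placeholder ambient value
    }

    // The whole trick of light mapping: spend hours here at build time so
    // that runtime lighting becomes a single texture fetch per fragment.
    void bake(std::vector<Texel>& lightmap) {
        for (Texel& t : lightmap)
            t.irradiance = gatherIrradiance(t.worldPos, t.normal, 4096);
    }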
The human brain has a built-in TAA solution that excels as frame latencies drop into single-digit milliseconds.
The problem is the demand for dynamic content in AAA games. Large exterior and interior worlds with dynamic lights, day and night cycles, glass and translucent objects, mirrors, water, fog and smoke. Everything should be interactable and destructible. And everything should be easy for artists to set up.
I would say the closest we can get is workarounds like radiance cascades. But everything other than raytracing is just an ugly workaround that falls apart in dynamic scenarios. And don't forget that baking times, and storing those results (leading to massive game sizes), are a huge negative.
Funnily enough, raytracing is also just an approximation of the real world, but at least artists and devs can expect it to work everywhere without hacks (in theory).
Manually placed lights and baking not only take time away from iteration but also take a lot of disk space for the shadow maps. RT makes development faster for the artists; I think DF even mentioned that doing Doom Eternal without RT would take so much disk space it wouldn't be possible to ship it.
edit: not Doom Eternal, it's Doom: The Dark Ages, the latest one.
It's fast enough today. Metro Exodus, an RT-only game, runs just fine at around 60 fps for me on a 3060 Ti. Looks gorgeous.
Light mapping is a cute trick and the reason why Mirror's Edge still looks so good after all these years, but it requires doing away with dynamic lighting, which is a non-starter for most games.
I want my true-to-life dynamic lighting in games thank you very much.
> it requires doing away with dynamic lighting
Most modern engines support (and encourage) use of a mixed lighting mode. You can have the best of both worlds. One directional RT light probably isn't going to ruin the pudding if the rest of the lights are baked.
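A sketch of that split, with invented names; engines label the modes differently (Unity, for instance, exposes Baked/Mixed/Realtime), but the partition is the same: static lights feed the offline baker, and the few dynamic ones stay in the runtime pass:

    #include <vector>

    enum class LightMode { Baked, Mixed, Realtime };
    struct Light { LightMode mode; /* color, position, ... */ };

    // Mixed pipeline in one line of logic: everything marked Baked feeds the
    // offline lightmapper; the rest (e.g. one raytraced sun) stays realtime.
    void partition(const std::vector<Light>& lights,
                   std::vector<Light>& bakeQueue,
                   std::vector<Light>& runtimeQueue) {
        for (const Light& l : lights)
            (l.mode == LightMode::Baked ? bakeQueue : runtimeQueue).push_back(l);
    }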
How is Metro Exodus Enhanced Edition (that is purely raytraced) compromised compared to regular version that uses traditional lighting?
Much higher resource demands, which then require tricks like upscaling to compensate. You also get uneven competition between GPU vendors, because in practice it is not generic hardware raytracing but Nvidia raytracing.
On a more subjective note, you get less interesting art styles, because studios somehow have to cram raytracing in as a value proposition.
Not OP, but a lot of the current kvetching about hardware-based ray tracing is that it's basically an Nvidia-exclusive party trick, similar to DLSS and PhysX. AMD has this inferiority complex where Nvidia must not be allowed to innovate with a hardware+software solution; it must be pure hardware so AMD can compete on their terms.
1. People somehow think that just because today's hardware can't handle RT all that well, it never will be able to. A laughable position, of course.
2. People turn on RT in games not designed with it in mind and therefore observe only minor graphical improvements for vastly reduced performance. It's a simple chicken-and-egg problem; hardware improvements will fix it.
The gimmicks aren't the product, and the customers of frontier technologies aren't the consumers. The gamers and redditors and smartphone fanatics, the fleets of people who dutifully buy, are the QA teams.
In accelerated compute, the largest areas of interest for advancement are 1) simulation and modeling and 2) learning and inference.
That's why this doesn't make sense to a lot of people. Sony and AMD aren't trying to extend current trends, they're leveraging their portfolios to make the advancements that will shape future markets 20-40 years out. It's really quite bold.
So far, AI upscaling/interpolation has just been used to ship horribly optimized games with a somewhat acceptable framerate.
And they're achieving "acceptable" frame rates and resolutions by sacrificing image quality in ways that aren't as easily quantified, so those downsides can be swept under the rug. Nobody's graphics benchmark emits metrics for how much ghosting is caused by the temporal antialiasing, or how much blurring the RT denoiser causes (or how much noise makes it past the denoiser). But they make for great static screenshots.
I disagree. From what I've read, if the game can leverage RT, artists save a considerable amount of time when iterating on level designs. Before RT they had to place lights manually, and any change to the level involved a lot of rework. This also saves storage, since there's no need to bake shadow maps.
So what stops the developers from iterating on a raytraced version of the game during development, and then executing a shadow precalculation step once the game is ready to ship? Make it an optional download, like the high-resolution texture packs. Otherwise they are offloading the processing power and energy requirements onto consumer PCs, and doing so in a very inefficient manner.
I wish devs would just make PS2-level graphics and let GenAI take care of the rest; the wait times for each AAA game are nuts.
So this is AMD catching up with Nvidia in the RT and AI upscaling/frame gen fields. Nothing wrong with it, and I am quite happy as an AMD GPU owner and Linux user.
But the way it is framed as a revolutionary step and as a Sony collab is a tad misleading. AMD is competent enough to do it by itself, and this will definitely show up on PC and the competing Xbox.
I think we don't have enough details to make statements like this yet. Sony have shown they are willing to make esoteric gaming hardware in the past (cell architecture) and maybe they'll do something unique again this time. Or, maybe it'll just use a moderately custom model. Or, maybe it's just going to use exactly what AMD have planned for the next few year anyway (as you say). Time will tell.
I'm rooting for something unique because I haven't owned a console for 20 years and I like interesting hardware. But hopefully they've learned a lesson about developer ergonomics this time around.
>Sony have shown they are willing to make esoteric gaming hardware in the past (cell architecture)
Just so we're clear, you're talking about a decision made over 20 years ago that didn't really pan out.
PS6 will be an upgraded PS5, without question. You aren't ever going to see a massive divergence away from the PC-like architecture everyone spent the last twenty years working towards.
The landscape favors Microsoft, but they’ll drop the ball, again.
> you're talking about a decision made over 20 years ago that didn't really pan out
The PS3 sold 87m units, and more importantly, it sold more than the Xbox 360, so I think it panned out fine even if we shouldn't call it a roaring success.
It did sell less than the PS2 or PS4, but I don't think that had much to do with the Cell architecture.
Game developers hated it, but that's a different issue.
I do agree that a truly unusual architecture like this is very unlikely for the next gen though.
Nintendo is getting it right (maybe): focus on first-party exclusive games and, uh, a pile of indies and ports from the PS3 and PS4 eras.
Come to think of it, Sony is also stuck in the PS4 era, since the PS5 Pro is basically a PS4 Pro that plays most of the same games but at 4K/60. (Though it does add a fast SSD and nice haptics on the DualSense controller.) But it's really about the games, and we haven't seen a lot of system-seller exclusives on the PS5 that aren't on PS4, PC, or other consoles. (Though I'm partial to Astro Bot and also enjoyed timed exclusives like FF16 and FF7 Rebirth.)
PS5 and Switch 2 are still great gaming consoles - PS5 is cheaper than many GPU cards, while Switch 2 competes favorably with Steam Deck as a handheld and hybrid game system.
This. I would also add Returnal to this list, but otherwise I agree. It's hard to believe it's been almost 5 years since the release of the PS5 and there are still barely any games that look as good as The Last of Us 2 or Red Dead Redemption 2, which were released on PS4.
I would agree with this. A lot of PS5 games using UE5+ with all its features run at sub-1080p30 (sometimes sub-720p30) upscaled to 1440p/4K and still look and run way, way worse than TLOU2/RDR2/Death Stranding 1/Horizon 1 on the PS4. Death Stranding 2, Horizon 2, and the Demon's Souls remake look and run far, far better (on a purely technical level) than any other PS5 game, and they all use rasterized lighting.
I really dislike the focus on graphics here; I think a lot of people are missing the big chunk of the article that's focused on efficiency.
If we can get high-texture, high-throughput content like dual 4K streams but with 1080p bandwidth, we can get VR that isn't as janky. If we can get lower power consumption, we can get smaller (and cooler) form factors, which means we might see a future where the PlayStation Portal is the console itself. I'm about to get on a flight to Sweden, and I'd kill to have something like my Steam Deck but running way cooler, way more powerful, and less prone to render errors.
I get the feeling Sony will definitely focus on graphics as that's been their play since the 90s, but my word if we get a monumental form factor shift and native VR support that feels closer to the promise on paper, that could be a game changer.
Seems like the philosophy here is: if you're going to do AI-based rendering, you might as well try it across different parts of the graphics pipeline and see if you can fine-tune it at the silicon level. Probably a micro-optimization, but if it makes the PS6 look a tiny bit better than the Xbox, people will pay for that.
Hopefully their game lineup is not as underwhelming as the PS5 one.
underwhelming? what do you mean?
every year, PlayStation ranks very high when it comes to GOTY nominations
just last year, PlayStation had the most nominations for GOTY: https://x.com/thegameawards/status/1858558789320142971
not only that, but the PS5 has more 1st-party games than Microsoft's Xbox Series X|S
1053 vs 812 (and the latter count is inflated by the recent Activision acquisition)
https://en.wikipedia.org/wiki/List_of_PlayStation_5_games
https://en.wikipedia.org/wiki/List_of_Xbox_Series_X_and_Seri...
It's important to check the facts before spreading random FUD
The PS5 had the strongest lineup of games this generation, which is why it sold this many consoles
Still today, consumers are attracted to PS5's lineup, and this is corroborated by facts and data https://www.vgchartz.com/
In August, for example, the sales ratio between PS5 and Xbox was 8:1, almost as good as the new Nintendo Switch 2's, and the console is almost 5 years old!
You say "underwhelming", people are saying otherwise
Yeah, I don’t recall a single original game from the PS5 exclusive lineup (that wasn’t available for PS4). We did get some remakes and sequels, but the PS5 lineup pales in comparison to the PS4 one.
Also, to my knowledge, the PS5 still lags behind the PS4 in terms of sales, despite the significant boost that COVID-19 provided.
The PS4 lineup pales in comparison to the PS3 lineup, which pales in comparison to the PS2 lineup, which pales in comparison to the PS1 lineup.
Each generation has around half the number of games as the previous. This does get a bit murky with the advent of shovelware in online stores, but my point remains.
I think all this proves is that games are now ridiculously expensive to create while meeting the expected quality standards. Maybe AI will improve this in the future. Take-Two has confirmed that GTA6's budget has exceeded US$1 billion, which is mind-blowing.
The most extreme example of this is that Naughty Dog, one of Sony's flagship first-party studios, has still yet to release a single original game for the PS5 after nearly five years. They've steadily been making fewer and fewer brand new games each generation and it's looking like they may only release one this time around. AAA development cycles are out of control.
Returnal is probably one of the best first-party games available, and it's a PS5 exclusive.
Its sequel Saros is coming out next year too.
There’s also Spider-Man 2, Ratchet and Clank Rift Apart, Astro Bot, Death Stranding 2, Ghost of Yotei…
Their output hasn't been worse than the PS4's at all, imo.
Are you talking about this Returnal? https://store.steampowered.com/agecheck/app/1649240/
There's simply no point in buying that console when it has like what, 7 exclusive titles that aren't shovelware? 7 titles after 5 years? And this number keeps going down because games are constantly being ported to other systems.
>constantly being ported to other systems.
And why wouldn't they? In many cases they're some compiler settings and a few drivers away from working.
Digital Foundry just released a video discussing this:
Yes, duh. It's a console; resolution scaling is the foremost tool in their arsenal for stabilizing the framerate. I can't think of a console game made in the past decade that doesn't "fake frames" at some part of the pipeline.
I'll also go a step further - not every machine-learning pass is frame generation. Nvidia uses AI for DLAA, a form of DLSS that works with 100% input resolution as a denoiser/antialiasing combined pass. It's absolutely excellent if your GPU can keep up with the displayed content.
I wonder how many variants of the PS6 they'll go through before they get a NIC that works right.
As someone working at an ISP, I am frustrated with how badly Sony has mangled the networking stack on these consoles. I thought BSD was supposed to be the best-of-breed in networking, but Sony has found all sorts of magical ways to make it Not Work.
From the PS5 variants that just hate 802.11ax to all the gamers making wild suggestions like changing MTU settings or DNS settings just to make your games work online... man, does Sony make it a pain for us to troubleshoot when they wreck it.
Bonus points: they took away the web browser, so we can't even attempt user-facing troubleshooting without abusing the obtuse third-party account-linking flow to sneak out into a browser and run a proper speed test against Speedtest/Fast, just to show that "no, it's PSN being slow, not us".
There sure is a lot of visionary(tm) thinking out there right now about the future of gaming, but what strikes me is how few of those visionaries(tm) have ever actually developed a game and taken it to market.
Not entirely unlike how many AI academics who step-functioned their compensation a decade ago by pivoting to the tech industry had no experience bringing an AI product to market, but certainly felt free to pontificate on how things are done.
I eagerly await the shakeout due from the weakly efficient market as the future of gaming ends up looking like nothing anyone imagineered.
"Uh oh, I don't like that sound of that..."
clicks article
"Project Amethyst is focused on going beyond traditional rasterization techniques that don't scale well when you try to "brute force that with raw power alone," Huynh said in the video. Instead, the new architecture is focused on more efficient running of the kinds of machine-learning-based neural networks behind AMD's FSR upscaling technology and Sony's similar PSSR system."
"Yep..."
Sigh.
Cell processor 2: electric boogaloo
Seems they didn't learn from the PS3: exotic architectures don't drive sales. Gamers don't give a shit, and devs won't choose it unless they have a lucrative first-party contract.
Custom graphics architectures aren't always a disaster - the Switch 2 is putting up impressive results with their in-house DLSS acceleration.
Now, shackling yourself to AMD and expecting a miracle... that I cannot say is a good idea. Maybe Cerny has seen something we haven't, who knows.
The Switch 1 also had CUDA cores and other basic hardware accelerators. To my knowledge (and I could be wrong), none of the APIs that Nintendo exposed even gave access to those fancy features. It should just be calls to NVN, which can be compiled into Vulkan the same way DXVK translates DirectX calls.
It's better off if I let Digital Foundry take it from here: https://youtu.be/BDvf1gsMgmY
TL;DW: it's not quite the full-fat CNN model, but it's also not a uselessly pared-back upscaler. It seems to handle antialiasing and simple upscaling well at super-low TDPs (<10 W).
Both raytracing and NPUs use a lot of memory bandwidth, and that is the resource scaling the least over time. Time will tell whether just going for more programmable compute would be better.
A new PS console already?
PS5 will be remembered as the worst PS generation.
I can't help but think that Sony and AMD would be better off developing a GPU-style PCIe card that has all their DRM, compute, and storage on the board, and then selling consoles that are just normal gaming PCs in a conveniently sized branded case with a PS card installed. If the card were sold separately at $300-400, it would instantly take over a chunk of the PC gaming market, and upgrades would be easier.
Maybe Sony should focus on getting a half-respectable library out on the PS5 before touting the theoretical merits of the PS6? It’s kind of wild how thin they are this go around. Their live service gambles clearly cost them this cycle and the PSVR2 landed with a thud.
Frankly, after releasing the $700 Pro and going "it's basically the same specs, but it can actually do 4K60 this time, we promise", and given how many friends I have with a PS5 sitting around as an expensive paperweight, I can't see a world where I get a PS6, despite decades of console gaming. The PS5 is an oversized Final Fantasy machine supported by remakes/remasters of all their hits from the PS3/PS4 era. It's kind of striking when you look at the most popular games on the console.
Don’t even get me started on Xbox lol
It really doesn't though. The library stacked against the PS4's doesn't even compare unless you want to count cross-platform titles, and even then the PS4 still smokes it. The fact that Helldivers 2 is one of the only breakout successes they've had (and it didn't even come from one of their internal studios) says everything. And of course they let it go cross-platform too, so that edge is gone now. All their best studios were tied up with live-service games that have all been canceled. They wasted 5+ years and probably billions if we include the missed-out sales. The PS4 was heavily driven by their close partner/internal teams, which continue to carry a significant portion of the PS5's player base.
If you don't need Final Fantasy or to (re)play improved PS4 games, the PS5 is an expensive paperweight, and you may as well just grab a Series S or something for half the price and half the shelf space, and play 90% of the same games.
Let me ask you this: should we really be taking this console seriously if they're about to go an entire cycle without Naughty Dog releasing a game?
No one is gonna give you groundbreaking tech for your electronic gadget... as IBM showed when they created the Cell for Sony and then gave almost the same tech to Microsoft :D
I'm just saying, no sane company is gonna give you any edge in chip tech.
I really hope that this doesn't come to pass. It's all in on the two worst trends in graphics right now. Hardware Raytracing and AI based upscaling.