amlib 3 days ago

Could the PS6 be the last console generation with a significant improvement in compute and graphics? Miniaturization keeps giving ever more diminishing returns with each shrink, and prices of electronics are going up (even sans tariffs), led by the increase in the price of making chips. Alternate techniques have slowly been introduced to offset the compute deficit: first with post-processing AA in the seventh generation, then with "temporal everything" hacks (including TAA) in the previous generation, and finally with minor usage of AI upscaling in the current generation and (projected) major usage of AI upscaling and frame-gen in the next one.

However, I'm pessimistic about how this can keep evolving. RT already takes a non-trivial amount of the transistor budget, and now those high-end AI solutions require another considerable chunk of it. If we are already reaching the limits of what non-generative AI upscaling and frame-gen can do, I can't see where a PS7 can go other than using generative AI to interpret a very crude low-detail frame and generate a highly detailed photorealistic scene from it, but that will, I think, require many times more transistor budget than will likely ever be economically achievable for a whole PS7 system.

Will that be the end of consoles? Will everything move to the cloud, with a power-guzzling 4 kW machine taking care of rendering your PS7 game?

I really can only hope there is a breakthrough in miniaturization and we can go back to a pace of improvement that can actually give us a new generation of consoles (and computers) that makes the transition from an SNES to an N64 feel quaint.

  • Loic 3 days ago

    My kids are playing Fortnite on a PS4; it works, they are happy, and I feel the rendering is really good (but I am an old guy). Normally, the only problem while playing is the stability of the Internet connection.

    We also have a lot of fun playing board games and card games, simple stuff design-wise; here, the gameplay is the fun factor. Yes, better hardware may bring more realism, more x or y, but my feeling is that the real driver, long term, is the quality of the gameplay. Like the quality of the storytelling in a good movie.

    • amlib 2 days ago

      Yes, that's something I failed to address in my post. I myself have also been happier playing older or just simpler games than chasing the latest AAA with cutting-edge graphics.

      What I see as a problem, though, is that the incumbent console manufacturers, Nintendo aside, have been chasing graphical fidelity since time immemorial as the main attraction of new console generations, and they may have a hard time convincing buyers to purchase a new system once they can't eke out significant gains in this area. Maybe they will successfully transition into something more akin to what Nintendo does and focus on delivering killer apps, gimmicks, and other innovations every new generation.

      Or perhaps they will slowly fall into irrelevance, everything will converge on PC/Steam (I doubt Microsoft can pull off whatever plan they have for the future of Xbox), any half-decent computer will run any game for decades to come, and Gabe Newell will become the richest person in the world.

    • LarsDu88 3 days ago

      Every generation thinks the current generation of graphics won't be topped, but I think you have no idea what putting realtime generative models into the rendering pipeline will do for realism. We will finally get rid of the uncanny valley effect in facial rendering, and the results will almost certainly be mind-blowing.

      • flohofwoe 2 days ago

        Every generation also thinks that the uncanny valley will be conquered in the next generation ;)

        The quest for graphical realism in games has been running into a diminishing-returns wall for quite a while now (see hardware raytracing: all that effort for slightly better reflections and shadows, yay?). What we need most right now is more risk-taking in gameplay by big-budget games.

      • Rover222 3 days ago

        I think the inevitable near future is that games are not just upscaled by AI, but they are entirely AI generated in realtime. I’m not technical enough to know what this means for future console requirements, but I imagine if they just have to run the generative model, it’s… less intense than how current games are rendered for equivalent results.

    • flyinglizard 3 days ago

      That's the Nintendo way. Avoiding the photorealism war altogether by making things intentionally sparse and cartoony. Then you can sell cheap hardware, make things portable etc.

      • tonyhart7 3 days ago

        There's also Nintendo's vision, which is "mobile gaming".

        Handheld devices like the Switch, Steam Deck, etc. really are the future. Phones are too, to some extent, but gaming on a phone vs. gaming on a handheld is a world of difference.

        Give it a few generations and traditional consoles will be obsolete. I mean, we literally have a lot of people enjoying indie games on the Steam Deck right now.

      • xiande04 3 days ago

        I.e., the uncanny valley.

        • gyomu 3 days ago

          Cartoony isn’t the uncanny valley. Uncanny valley is attempted photorealism that misses the mark.

    • pipes 3 days ago

      Unreal engine 1 looks good to me, so I am not a good judge.

      I keep thinking there is going to be a video game crash soon, from oversaturation of samey games. But I'm probably wrong about that. I just think that's what Nintendo had right all along: if you commoditize games, they become worthless. We have an endless choice of crap now.

      In 1994, at age 13, I stopped playing games altogether. Endless 2D fighters and 2D platformers were just boring. It would take playing Wave Race and GoldenEye on the N64 to drag me back in. They were truly extraordinary and completely new experiences (me and my mates never liked Doom). Anyway, I don't see this kind of shift ever happening again. In fact, talking to my 13-year-old nephew confirms what I (probably wrongly) believe: he's complaining there's nothing new. He's bored of Fortnite and Minecraft and whatever else. It's like he's experiencing what I experienced, but I doubt a new generation of hardware will change anything.

      • dehrmann 3 days ago

        > Unreal engine 1 looks good to me, so I am not a good judge.

        But we did hit a point where the games were good enough, and better hardware just meant more polygons, better textures, and more lighting. The issue with Unreal Engine 1 (or maybe just games of that era) was that the worlds were too sparse.

        > oversaturation of samey games

        So that's the thing. Are we at a point where the graphics and gameplay of 10-year-old games are good enough?

      • teamonkey 3 days ago

        I get so sad when I hear people say there’s no new games. There are so many great, innovative games being made today, more than any time in history. There are far more great games on Steam than anyone can play in a lifetime.

        Even AAAs aim to create new levels of spectacle (much like blockbuster movies), even if they don’t innovate on gameplay.

        The fatigue is real (and I think it’s particularly bad for this generation raised to spend all their gaming time inside the big 3), but there’s something for you out there, the problem is discoverability, not a lack of innovation.

        • rowanG077 2 days ago

          This, so much. Anyone saying games used to be better is either not looking or has lost their sight to nostalgia.

      • tonyhart7 3 days ago

        "if you commoditize games, they become worthless"

        Hmm, wrong? If everyone can make games, the floor rises, making the "industry standard" for a game really high.

        While I agree with you that if everything is A, then A doesn't mean anything, the problem is that A doesn't vanish; it just moves to another, higher tier.

        • pipes 2 days ago

          You probably have a point, and it's not something I believe completely. My main problem, I think, is that I have seen nothing new in games for at least 20 years.

          Gunpei Yokoi said something similar here:

          https://shmuplations.com/yokoi/

          Yokoi: When I ask myself why things are like this today, I wonder if it isn’t because we’ve run out of ideas for games. Recent games take the same basic elements from older games, but slap on characters, improve the graphics and processing speed… basically, they make games through a process of ornamentation.

  • Uvix 3 days ago

    It sounds like even the PS6 isn't going to have a significant improvement, and that the PS5 was the last such console. The PS5 Pro was the first console focused on fake frame generation instead of real output resolution/frame rate improvements, and per the article the PS6 is continuing that trend.

    • PaulHoule 3 days ago

      What really matters is the cost.

      In the past, a game console might launch at a high price point, and then, after a few years, the price would go down and they could release a new console at a price close to where the last one started.

      Blame crypto, AI, or COVID, but there has been no price drop for the PS5, and if there were going to be a PS6 that was really better, it would probably have to cost upwards of $1000, and you might as well get a PC. Sure, there are people who haven't tried Steam + an Xbox controller and think PC gaming is all unfun and sweaty, but they will come around.

      • Retric 3 days ago

        Inflation. The PS5 standard at $499 in 2020 is about $632 in 2025 money, which is the same as the 1995 PS1 when adjusted for inflation: $299 (1995) is $635 (2025). https://www.usinflationcalculator.com/

        Thus the PS6 should be around $699 at launch.
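        A quick sanity check of that arithmetic (a minimal sketch; the CPI multipliers are approximations read off the linked calculator, not exact values):

          # Rough check of the inflation math above. The multipliers are
          # assumed approximations from the linked calculator.
          ps1_1995 = 299           # PS1 US launch price
          ps5_2020 = 499           # PS5 (disc) US launch price
          cpi_1995_to_2025 = 2.12  # assumed multiplier, 1995 -> 2025
          cpi_2020_to_2025 = 1.27  # assumed multiplier, 2020 -> 2025

          print(round(ps1_1995 * cpi_1995_to_2025))  # ~634
          print(round(ps5_2020 * cpi_2020_to_2025))  # ~634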

      • dangus 3 days ago

        But now you’re assuming the PC isn’t also getting more expensive.

        If a console designed to break even is $1,000, then surely equivalent PC hardware designed to be profitable without software sales revenue will be more expensive.

      • Uvix 3 days ago

        As long as I need a mouse and keyboard to install updates or to install/start my games from GOG, it's still going to be decidedly unfun, but hopefully Windows' upcoming built-in controller support will make it less unfun.

      • greenavocado 3 days ago

        How many grams of gold has the PS cost at launch, using gold prices on launch day?

        • ssl-3 3 days ago

          If I'm doing this right, then:

          PS1: 24.32 grams at launch

          PS5 (disc): 8.28 grams at launch

          (So I guess that if what one uses for currency is a sock drawer full of gold, then consoles have become a lot cheaper in the past decades.)
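          The math, for anyone who wants to check it (launch prices are the US MSRPs; the gold spot prices are my rough assumptions for each launch day):

            # Console launch price expressed in grams of gold.
            # Gold prices are rough assumed spot prices on each launch day.
            TROY_OUNCE_GRAMS = 31.1035

            def grams_of_gold(price_usd, gold_usd_per_ozt):
                return price_usd * TROY_OUNCE_GRAMS / gold_usd_per_ozt

            print(grams_of_gold(299, 383))   # PS1, Sept 1995: ~24.3 g
            print(grams_of_gold(499, 1875))  # PS5 disc, Nov 2020: ~8.3 g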

      • cyanydeez 3 days ago

        I'm still watching 720p movies and playing 720p video games.

        Somewhere between 60 Hz and 240 Hz, there's zero fundamental benefit. Same for resolution.

        It isn't just hardware progress that follows a sigmoid; our experiential value does too.

        The reality is that exponential improvement is not a fundamental force. It's always going to find some limit.

    • ZiiS 3 days ago

      Really strange that a huge pile of hacks, maths, and more hacks became the standard for "true" frames.

  • crote 3 days ago

    Consoles are the perfect platform for a proper pure ray tracing revolution.

    Ray tracing is the obvious path towards perfect photorealistic graphics. The problem is that ray tracing is really expensive, and you can't stuff enough ray tracing hardware into a GPU which can also run traditional graphics for older games. This means games are forced to take a hybrid approach, with ray tracing used to augment traditional graphics.

    However, full-scene ray tracing has essentially a fixed cost: the hardware needed depends primarily on the resolution and framerate, not the complexity of the scene. Rendering a million photorealistic objects is not much more compute-intensive than rendering a hundred cartoon objects, and without all the complicated tricks needed to fake things in a traditional pipeline, any indie dev could make games with AAA graphics. And if you have the hardware for proper full-scene ray tracing, you no longer need all the AI upscaling and framegen to fake it...

    Ideally you'd want a GPU which is 100% focused on ray tracing and ditches the entire legacy triangle pipeline - but that's a very hard sell in the PC market. Consoles don't have that problem, because not providing perfect backwards compatibility for 20+ years of games isn't a dealbreaker there.
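    Where the "essentially fixed cost" intuition comes from: with a bounding volume hierarchy, each ray visits roughly O(log n) nodes instead of testing all n objects (though, as the replies below point out, the cost is not literally flat). A toy sketch of that scaling, in hypothetical code rather than any real API:

      # Toy 1D "BVH": a query visits ~O(log n) nodes, not all n objects.
      def build(objs):
          if len(objs) == 1:
              return {"bounds": objs[0], "leaf": objs[0]}
          mid = len(objs) // 2
          left, right = build(objs[:mid]), build(objs[mid:])
          bounds = (min(left["bounds"][0], right["bounds"][0]),
                    max(left["bounds"][1], right["bounds"][1]))
          return {"bounds": bounds, "leaf": None, "l": left, "r": right}

      def query(node, x, visits):
          visits[0] += 1                  # count nodes touched by this "ray"
          lo, hi = node["bounds"]
          if not (lo <= x <= hi):
              return None
          if node["leaf"] is not None:
              return node["leaf"]
          return query(node["l"], x, visits) or query(node["r"], x, visits)

      for n in (100, 100_000):            # "cartoon" vs "photoreal" object count
          tree = build([(i, i + 1.0) for i in range(n)])
          visits = [0]
          query(tree, n / 2 + 0.5, visits)
          print(n, "objects ->", visits[0], "nodes visited")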

    • Aurornis 3 days ago

      > Rendering a million photorealistic objects is not much more compute-intensive than rendering a hundred cartoon objects

      Increasing the object count by that many orders of magnitude is definitely much more compute intensive.

      • reactordev 3 days ago

        Only if you have more than 1 bounce. Otherwise it’s the same. You’ll cast a ray and get a result.

    • khalladay 3 days ago

      > Rendering a million photorealistic objects is not much more compute-intensive than rendering a hundred cartoon objects

      Surely ray/triangle intersection tests, BRDF evaluation, and acceleration structure rebuilds (when things move/animate) would all cost more in your photorealistic scenario than in the cartoon scenario?

      • reactordev 3 days ago

        That's all just matrix multiplication, and GPUs are already really good at doing that in parallel.

    • thfuran 2 days ago

      >Consoles don't have that problem, because not providing perfect backwards compatibility for 20+ years of games isn't a dealbreaker there.

      I'm not sure that's actually true for Sony. You can currently play several generations of games on the PS5, and I think losing that on PS6 would be a big deal to a lot of people.

      • kevincox 2 days ago

        Maybe they can pull the old console trick of just including a copy of the old hardware inside the new console.

        However I suspect that this isn't as cost and space effective as it used to be.

    • cubefox 3 days ago

      Combining both ray tracing (including path tracing, which is a form of ray tracing) and rasterization is the most effective approach. The way it is currently done is that primary visibility is calculated using triangle rasterization, which produces perfectly sharp and noise-free textures, and then the ray-traced lighting (slightly blurry due to the low sample count and denoising) is layered on top.

      > However, full-scene ray tracing has essentially a fixed cost: the hardware needed depends primarily on the resolution and framerate, not the complexity of the scene.

      That's also true for modern rasterization with virtual geometry. Virtual geometry keeps the number of rendered triangles roughly proportional to the screen resolution, not to the scene complexity. Moreover, virtual textures also keep the amount of texture detail in memory roughly proportional to the screen resolution.

      The real advantage of modern ray tracing (ReSTIR path tracing) is that it is independent of the number of light sources in the scene.
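      As a structural sketch of that hybrid pipeline (stub functions under assumed names, not any real engine API):

        # Hybrid pipeline shape: sharp rasterized primary visibility,
        # noisy ray traced lighting denoised and layered on top.
        def rasterize_gbuffer(scene):
            # Primary visibility: per-pixel albedo/normal/depth, perfectly sharp.
            return {"albedo": None, "normal": None, "depth": None}

        def trace_lighting(scene, gbuffer, spp=1):
            # Ray/path traced GI, shadows, reflections at a low sample count.
            return {"gi": None, "shadows": None, "reflections": None}

        def denoise(lighting, gbuffer):
            # Temporal/spatial denoiser; the source of the slight blur noted above.
            return lighting

        def render_frame(scene):
            g = rasterize_gbuffer(scene)
            light = denoise(trace_lighting(scene, g), g)
            return (g, light)  # composite: lighting modulated by the sharp albedo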

    • newsclues 3 days ago

      So create a system with an RT-only GPU plus a legacy one, for the best of both worlds?

  • JoshTriplett 3 days ago

    After raytracing, the next obvious massive improvement would be path tracing.

    And while consoles usually lag behind the latest available graphics, I'd expect raytracing and even path tracing to become available to console graphics eventually.

    One advantage of consoles is that they're a fixed hardware target, so games can test on the exact hardware and know exactly what performance they'll get, and whether they consider that performance an acceptable experience.

    • winterismute 3 days ago

      There is no real difference between "ray tracing" and "path tracing"; or better, the former is just the operation of intersecting a ray with a scene (and not a rendering technique), while the latter is a way to solve the integral that approximates the rendering equation (hence, it could be considered a rendering technique). Sure, you can go back to the terminology used by Kajiya in his earlier works etc., but that was only an "academic terminology game" which is worthless today. Today, the former has been accelerated by hardware for around a decade (I am counting the PowerVR Wizard); the latter is how most non-realtime rendering renders frames.

      You cannot have "path tracing" in games, not according to what it actually is. And it also probably does not make sense, because the goal of real-time rendering is not to render the perfect frame at any time, but to produce the most reactive, coherent sequence of frames possible in response to the simulation and player inputs. That being said, HW ray tracing is still somehow game-changing, because it shapes a SIMT HW to be good at inherently divergent computation (e.g. traversing a graph of nodes representing a scene): following this direction, many more things will be unlocked in real-time simulation and rendering. But not 6k samples unidirectionally path-traced per pixel in a game.
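      For reference, the integral in question is Kajiya's rendering equation; path tracing is a Monte Carlo estimator of it, while "ray tracing" in the hardware sense is just the scene-intersection primitive used inside that estimator:

        L_o(x, \omega_o) = L_e(x, \omega_o)
            + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (n \cdot \omega_i) \, d\omega_i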

  • Keyframe 3 days ago

    Not all games need horsepower. We're now past the point of good enough to run a ton of them. Sure, tentpole attractions will warrant more and more, but we're turning back to mechanics, input methods, gameplay, storytelling. If you play 'old' games now, they're perfectly playable, just like older movies are perfectly watchable. Not saying you should play those (you should), but there's not much of a leap needed to keep such ideas going strong and fresh.

    • ad133 3 days ago

      This is my take as well. I haven’t felt that graphics improvement has “wowed” me since the PS3 era honestly.

      I’m a huge fan of Final Fantasy games. Every mainline game (those with just a number; excluding 11 and 14 which are MMOs) pushes the graphical limits of the platforms at the time. The jump from 6 to 7 (from SNES to PS1); from 9 to 10 (PS1 to 2); and from 12 to 13 (PS3/X360) were all mind blowing. 15 (PS4) and 16 (PS5) were also major improvements in graphics quality, but the “oh wow” generational gap is gone.

      And then I look at the gameplay of these games, and it's generally regarded as going in the opposite direction. It's all subjective, of course, but 10 is widely considered the last "amazing" overall game, with opinions dropping off from there.

      We’ve now reached the point where an engaging game with good mechanics is way more important than graphics: case in point being Nintendo Switch, which is cheaper and has much worse hardware, but competes with the PS5 and massively outsells Xbox by huge margins, because the games are fun.

      • musicale 3 days ago

        FF12 and FF13 are terrific games that have stood the test of time.

        And don't forget the series of MMOs:

        FF11 merged Final Fantasy with old-school MMOs, notably Everquest, to great success.

        FF14 2.0 was literally A Realm Reborn from the ashes of the failed 1.0, and was followed by the exceptional Heavensward expansion.

        FF14 Shadowbringers was and is considered great.

  • dehrmann 3 days ago

    > non-generative AI upscaling

    I know this isn't an original idea, but I wonder if this will be the trick for a step-level improvement in visuals: use traditional 3D models for the broad strokes and generative AI for texture and lighting details. We're at diminishing returns for adding polygons and better lighting, and generative AI seems to be better at improving things from there—when it doesn't have to get the finger count right.

  • [removed] 3 days ago
    [deleted]
  • ClimaxGravely 3 days ago

    I'd hesitate to call the temporal hacks progress. I disable them every time.

  • jayd16 3 days ago

    There's likely still room to go super wide with CPU cores and much more RAM, but everyone is talking about neural nets, so that's what the press release is about.

  • bob1029 3 days ago

    Games using weird tech is not a hardware manufacturer or availability issue. It is a game studio leadership problem.

    Even in the latest versions of Unreal and Unity you will find the classic tools. They just won't be advertised, and the engine vendor might even frown upon them during a tech demo to make their fancy new temporal slop solution seem superior.

    The trick is to not get taken for a ride by the tools vendors. Real-time lights, "free" anti-aliasing, and sub-pixel triangles are the forbidden fruits of game dev. It's really easy to get caught up in the devil's bargain of trading unlimited art detail for unknowns at end-customer time.

  • EasyMark 3 days ago

    Doubtful; they say this with every generation of consoles and even gaming PC systems. When its popularity decreases, profits decrease, and then maybe it will be "the last generation".

  • dataangel 3 days ago

    They can't move everything to the cloud because of latency.

  • xiande04 3 days ago

    It's not just technology that's eating away at console sales, it's also the fact that 1) nearly everything is available on PC these days (save Nintendo with its massive IP), 2) mobile gaming, and 3) there's a limitless amount of retro games and hacks or mods of retro games to play and dedicated retro handhelds are a rapidly growing market. Nothing will ever come close to PS2 level sales again. Will be interesting to see how the video game industry evolves over the next decade or two. I suspect subscriptions (sigh) will start to make up for lost console sales.

    • tonyhart7 3 days ago

      "Nothing will ever come close to PS2 level sales again."

      The PS2 sales number is iffy, at the very least. Also, PS2 sales have been "dethroned" a few times (quotation marks intended): whenever Nintendo's sales crept up, Sony announced a "few million sales" added, even though they had already stopped producing the console years ago.

    • theshackleford 3 days ago

      > Nothing will ever come close to PS2 level sales again.

      The Switch literally has, and according to projections the Switch 1 will in fact have outsold the PS2 globally by the end of the year.

  • Mistletoe 3 days ago

    Welcome to the Age of the Plateau. It will change everything we know. Invest accordingly.

  • aurareturn 3 days ago

    Beyond the PS6, the answer is very clearly graphics generated in real time via a transformer model.

    I’d be absolutely shocked if in 10 years, all AAA games aren’t being rendered by a transformer. Google’s veo 3 is already extremely impressive. No way games will be rendered through traditional shaders in 2035.

    • wartywhoa23 3 days ago

      The future of gaming is the Grid-Independent Post-Silicon Chemo-Neural Convergence, the user will be injected with drugs designed by AI based on a loose prompt (AI generated as well, because humans have long lost the ability to formulate their intent) of the gameplay trip they must induce.

      Now that will be peak power efficiency and a real solution for the world where all electricity and silicon are hogged by AI farms.

      /s or not, you decide.

      • pavlov 3 days ago

        Stanislaw Lem’s “The Futurological Congress” predicted this in 1971.

        • wartywhoa23 3 days ago

          FYI, it got an amazing film adaptation by Ari Folman, his 2013 "The Congress". The most emotionally striking film I've ever watched.

      • speed_spread 3 days ago

        There will be a war between these biogamers and smart consoles that can play themselves.

    • lm28469 3 days ago

      Is this before or after fully autonomous cars and AGI? Both should be here in two years, right?

      10 years ago people were predicting VR would be everywhere, it flopped hard.

      • aurareturn 3 days ago

        I've been riding Waymo for years in San Francisco.

        10 years ago, people were predicting that deep learning would change everything. And it did.

        Why take just one example (VR) and apply it to everything? Even then, a good portion of people did not think VR would be everywhere by now.

      • wartywhoa23 3 days ago

        It did flop, but still a hefty loaf of money was sliced off in the process.

        Those with the real vested interest don't care if it flops, while zealous worshippers of the next brand-new disruptive tech are just a free vehicle to that end.

      • kranke155 3 days ago

        VR is great industrial tech and bad consumer tech. It’s too isolating for consumers.

    • MarCylinder 3 days ago

      Just because it's possible doesn't mean it is clearly the answer. Is a transformer model truly likely to require less compute than current methods? We can't even run models like Veo 3 on consumer hardware at their current level of quality.

      • aurareturn 2 days ago

        I’d imagine AAA games will evolve to hundreds of billions of polygons and full path tracing. There is no realistic way to compute a scene like that on consumer hardware.

        The answer is clearly transformer based.

    • fidotron 3 days ago

      Transformer maybe not, but neural net yes. This is profoundly uncomfortable for a lot of people, but it's the very clear direction.

      The other major success of recent years, not discussed much so far, is Gaussian splats, which tear up the established production pipeline again.

      • aurareturn 3 days ago

        A neural net is already being used via DLSS. Neural rendering is the next step. And finally, a full transformer-based rendering pipeline. My guess, anyway.

    • CuriouslyC 3 days ago

      That's just not efficient. AAA games will use AI to pre-render assets, and use AI shaders to make stuff pop more, but on the fly asset generation will still be slow and produce low quality compared to offline asset generation. We might have a ShadCN style asset library that people use AI to tweak to produce "realtime" assets, but there will always be an offline core of templates at the very least.

      • aurareturn 2 days ago

        It is likely a hell of a lot more efficient than path tracing a full ultra realistic game with billions of polygons.

    • Certhas 3 days ago

      This _might_ be true, but it's utterly absurd to claim this is a certainty.

      The images rendered in a game need to accurately represent a very complex world state. Do we have any examples of Transformer based models doing something in this category? Can they do it in real-time?

      I could absolutely see something like rendering a simplified and stylised version and getting Transformers to fill in details. That's kind of a direct evolution from the upscaling approach described here, but end to end rendering from game state is far less obvious.

      • kgdiem 3 days ago

        Doesn’t this imply that a transformer or NN could fill in details more efficiently than traditional techniques?

        I’m really curious why this would be preferable for a AAA studio game outside of potential cost savings. Also imagine it’d come at the cost of deterministic output / consistency in visuals.

      • aurareturn 3 days ago

          I could absolutely see something like rendering a simplified and stylised version and getting Transformers to fill in details. That's kind of a direct evolution from the upscaling approach described here, but end to end rendering from game state is far less obvious.
        
        Sure. This could be a variation. You do a quick render that any GPU from 2025 can do and then make the frame hyper realistic through a transformer model. It's basically saying the same thing.

        The main rendering would be done by the transformer.

        Already in 2025, Google Veo 3 is generating pixels far more realistic than AAA games. I don't see why this wouldn't be the default rendering mode for AAA games in 2035. It's insanity to think it won't be.

        Veo3: https://aistudio.google.com/models/veo-3

      • mdale 3 days ago

        Genie 3 is already a frontier approach to interactive generative world views, no?

        It will be AI all the way down soon. The model's internal world view could be multiple passes and multi-layer, with different strategies... In any case, it's safe to say more AI will be involved in more places ;)

        • Certhas 3 days ago

          I am super intrigued by such world models. But at the same time it's important to understand where they are at. They are celebrating the achievement of keeping the world mostly consistent for 60 seconds, and this is 720p at 24fps.

          I think it's reasonable to assume we won't see this tech replace game engines without significant further breakthroughs...

          For LLMs agentic workflows ended up being a big breakthrough to make them usable. Maybe these World Models will interact with a sort of game engine directly somehow to get the required consistency. But it's not evident that you can just scale your way from "visual memory extending up to one minute ago" to 70+ hour game experiences.

    • KeplerBoy 3 days ago

      Be prepared to be shocked. This industry moves extremely slow.

      • aurareturn 2 days ago

        They'll have to move fast when a small team can make a graphically richer game than a big and slow AAA studio.

        Competition works wonders.

magicalhippo 3 days ago

I was going to say "again?", but then I recalled DirectX 12 was released 10 years ago and now I feel old...

The main goal of Direct3D 12, and subsequently Vulkan, was to allow for better use of the underlying graphics hardware as it changed more and more from its fixed-pipeline roots.

So maybe the time is ripe for a rethink, again.

In particular, the frame generation features, upscaling and frame interpolation, have promise, but I think they need to be integrated in a different way to really be of benefit.

  • pjmlp 3 days ago

    The rethink is already taking place via mesh shaders and neural shaders.

    You aren't seeing them adopted that much because the hardware still isn't deployed at a scale that games can count on them being available, and that in turn holds back improvements to the developer experience of adopting them.

    • ksec 2 days ago

      How far along are neural shaders, and are there any examples? I tried searching for them, but 90% of all results are AI-generated nonsense.

  • Hikikomori 3 days ago

    Don't forget mantle.

    • scns 3 days ago

      Didn't Mantle become Vulkan?

      • flohofwoe 3 days ago

        Yeah, but that doesn't mean that much of Mantle is recognizable in Vulkan, because Vulkan wanted to cover the entire range of GPU architectures (including outdated and mobile GPUs) with a single API, while Mantle was designed for modern (at the time) desktop GPUs (and specifically AMD GPUs). Vulkan basically took an elegant design and "ruined" it with too much real-world pragmatism ;)

    • magicalhippo 3 days ago

      While I didn't forget about it, I did misremember the timeline. So yea, Mantle should definitely be mentioned.

  • UltraSane 3 days ago

    I remember reading about DirectX 1 in PC Gamer magazine.

  • [removed] 3 days ago
    [deleted]
poisonborz 3 days ago

The industry, and at large the gaming community, is long past being interested in graphics advancement. AAA games are too complicated and expensive; the whole notion of ever more complex and grandiose experiences doesn't scale. Gamers are fractured along thousands of small niches, even in the sense of timeline, with the 80s, 90s, and PS1 eras each having a small circle of businesses serving them.

The times of console giants, their fiefdoms, and the big game studios are coming to an end.

  • seanalltogether 3 days ago

    I'll take the other side of this argument and state that people are interested in better graphics, BUT they expect an equally improved simulation to go along with them. People aren't excited for GTA6 just because of the graphics, but because they know the simulation is going to be better than anything they've seen before. They need to go hand in hand.

    • jesse__ 3 days ago

      That's totally where all this is going. More horsepower on a GPU doesn't necessarily mean it's all going towards pixels on the screen. People will get creative with it.

    • nitwit005 2 days ago

      I'm almost certain that we'll see comments that GTA6 feels like a downgrade to big GTA5 fans, as there was a decade of content created for the online version of GTA5.

  • rafaelmn 3 days ago

    I disagree - current gen consoles aren't enough to deliver smooth, immersive graphics. I played BG3 on PS first and then on PC, and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to a consistent 120 fps at 4K and better graphics, and I'll buy the games.

    And there are AAAs that make, and will make, good money with graphics front and center.

    • Ntrails 3 days ago

      >aren't enough to deliver smooth, immersive graphics

      I'm just not sold.

      Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game? Not to me, certainly. Was Cyberpunk prettier than The Witcher 3? Did it need to be for me to play it?

      My query isn't about whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play on the old console with worse graphics.

      I also don't think anyone is going to suddenly start playing video games because the graphics improve further.

      • rafaelmn 3 days ago

        > Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game?

        Absolutely - graphical improvements make the game more immersive for me, and I don't want to go back and replay the games I spent hundreds of hours on in the mid-2000s, like, say, NWN or Icewind Dale (never played BG2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics, and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast, because it just isn't as captivating as it was back when it was peak graphics.

      • keyringlight 3 days ago

        Two aspects I keep thinking about:

        - How difficult it must be for the art/technical teams at game studios to figure out, of all the detail they are capable of putting on screen, how much will actually be appreciated by gamers. Essentially, making sure that anything they budget significant worker time to create isn't something gamers will run right past and ignore, and that it contributes meaningfully to 'more than the sum of its parts'.

        - As much as technology is an enabler for art, alongside the install-base issue, how well does pursuing new methods fit how their studio is used to working, and is the payoff there if they spend time adapting? A lot of the gaming business is about shipping product, and a studio's concern is primarily getting content to gamers rather than chasing tech, as that is what lets their business continue; selling GPUs/consoles is another company's business.

    • pjmlp 3 days ago

      Being an old dog who still cares about gaming, I would assert that many games are also not taking advantage of current gen hardware, being coded in Unreal and Unity, a kind of Electron for games in what concerns use of the existing hardware.

      There is a reason there are so many complaints on social media about it being obvious to gamers which engine a game was written in.

      It used to be that game development quality was taken more seriously, when games were sold via storage media and there was a deadline to burn those discs/cartridges.

      Now they just ship whatever is done by the deadline, and updates will come later, via DLC, if at all.

      • jayd16 3 days ago

        They're both great engines. They're popular and gamers will lash out at any popular target.

        If it were so simple to bootstrap an engine, no one would pay the percentage points to Unity and Epic.

        The reality is the quality bar is insanely high.

        • gyomu 3 days ago

          It is pretty simple to bootstrap an engine. What isn't simple is supporting asset production pipelines on which dozens or hundreds of people can work simultaneously, and on which new hires/contractors can start contributing right away, which is what modern game businesses require and what Unity/Unreal provide.

      • formerly_proven 3 days ago

        Unreal and Unity would be less problematic if these engines were engineered to match the underlying reality of graphics APIs/drivers, but they're not. Neither can systematically fix the shader stuttering they cause architecturally, and so essentially all games built on these platforms are sentenced to always stutter, regardless of hardware.

        Both seem to suffer from incentive issues similar to enterprise software: they're not marketing and selling to end users or professionals, but to studio executives. So it's important to have - preferably a steady stream of - flashy headline features (e.g. Nanite, Lumen) instead of a product that actually works at the most basic level (consistently rendering frames). It doesn't really matter to Epic Games that UE4/5 RT is largely unplayable; even for game publishers, if you can pull nice-looking screenshots out of the engine or do good-looking 24p offline renders (and slap "in-game graphics" on them), that's good enough.

    • flohofwoe 3 days ago

      Just get a PC then? ;) In the end, game consoles haven't been much more than "boring" subsidized low-end PCs for quite a while now.

      • rafaelmn 3 days ago

        A PC costs a lot and depreciates fast; by the end of a console lifecycle I can still count on developers targeting it, whereas PC performance on 6+ year old hardware is guaranteed to suck. And I'm not a heavy gamer - I'll spend ~100h on games per year, but so will my wife and my son - and a PC sucks for multiple people using it, while the PS is amazing for that. I know I could concoct some remote play setup via LAN on the TV to let my wife and kids play, but I just want something I spend a few hundred EUR on, plug into the TV, and it works.

        Honestly, the only reason I caved on the GPU purchase (which cost the equivalent of a PS Pro) was local AI - but in retrospect that was useless as well.

    • gnulinux996 3 days ago

      > current gen consoles aren't enough to deliver smooth, immersive graphics

      The Last of Us franchise, especially Part 2, has been the most immersive experience I have had in gaming.

      That game pretty much told me that the PlayStation is more than capable of delivering these kinds of experiences.

      Now, if some of those high-budget, so-called AAA games cannot deliver even a fraction of that, that - I believe - is on them.

    • wiseowise 3 days ago

      > current gen consoles aren't enough to deliver smooth, immersive graphics

      They have been enough to deliver smooth, immersive graphics since the PS4 era.

  • pornel 3 days ago

    Advancements in lighting can help all games, not just AAA ones.

    For example, Tiny Glade and Teardown have ray traced global illumination, which makes them look great with their own art style, rather than expensive hyper-realism.

    But currently this is technically hard to pull off, and works only within certain constrained environments.

    Devs are also constrained by the need to support multiple generations of GPUs. That's great from perspective of preventing e-waste and making games more accessible. But technically it means that assets/levels still have to be built with workarounds for rasterized lights and inaccurate shadows. Simply plugging in better lighting makes things look worse by exposing the workarounds, while also lacking polish for the new lighting system. This is why optional ray tracing effects are underwhelming.

  • goalieca 3 days ago

    Nintendo dominated last generation with the Switch. The games were only HD, and many ran at 30fps. Some AAA titles didn't even get ported to it. But they sold a ton of units and a ton of games, and few complained, because people were having fun, which is what gaming is all about anyway.

    • Fire-Dragon-DoL 3 days ago

      That is a different audience than people playing on PC/Xbox/PS5. Although arguably each console has a different audience, so there is that.

      • theshackleford 3 days ago

        > That is a different audience than people playing on PC/Xbox/PS5.

        Many PC users also own a Switch. It is in fact one of the most common pairings. There is very little I can't get on PC from PS/Xbox, so there's very little point in owning one of those, but I won't get any of the Nintendo titles elsewhere, so keeping a Switch around makes significantly more sense if I want to cover my bases for exclusives.

        • Fire-Dragon-DoL 3 days ago

          I agree, but bases for exclusives is one way to differentiate an audience. I literally don't like any game Nintendo makes, except maybe Zelda, and I wouldn't buy a Switch just for that. I do have a Switch because I have kids, though.

  • b_e_n_t_o_n 3 days ago

    idk, Battlefield 6 came out today to very positive reviews and it's absolutely gorgeous.

    • jimaek 3 days ago

      It's fine, but definitely a downgrade compared to previous titles like Battlefield 1. At moments it looks pretty bad.

      I'm curious why graphics are stagnating and even getting worse in many cases.

      • flohofwoe 3 days ago

        Exploding production cost is pretty much the only reason (e.g. we hit diminishing returns in overall game asset quality vs production cost at least a decade ago), plus, on the tech side, a brain drain from rendering tech to AI tech (or whatever the current best-paid mega-hype is). Also, working in gamedev simply isn't "sexy" anymore, since it has been industrialized into essentially assembly-line jobs.

        • teamonkey 3 days ago

          It’s far from an assembly line job, but it’s unstable, challenging and the pay hasn’t kept up with the rest of the tech sector.

      • b_e_n_t_o_n 3 days ago

        Have you played it? I haven't, so I'm just basing my opinion on some YouTube footage I've seen.

        BF1 is genuinely gorgeous, I can't lie. I think it's the photogrammetry. Do you think the lighting is better in BF1? I'm gonna go out on a limb and say that BF6's lighting is more dynamic.

        • jimaek 3 days ago

          Yes I played it on a 4090. The game is good but graphics are underwhelming.

          To my eyes everything looked better in BF1.

          Maybe it's trickery, but it doesn't matter to me. BF6, the new COD, and other games all look pretty bad, at least compared to what I would expect from games in 2025.

          I don't see any real differences from similar games released 10 years ago.

    • ksec 3 days ago

      It looks like Frostbite 4.0 is so much better than Unreal 5.x. I can't wait to see a comparison.

pixelpoet 3 days ago

Teenage me from the 90s, who told everyone that ray tracing would eventually take over all rendering and got laughed at, would be happy :)

  • Sesse__ 3 days ago

    It's not, though. The use of RT in games is generally limited to secondary rays; the primaries are still rasterized. (Though the rasterization is increasingly done in “software rendering”, aka compute shaders.)

    • pixelpoet 3 days ago

      As you can tell, I'm patient :) A very important quality for any ray tracing enthusiast lol

      The ability to do irregular sampling, efficient shadow computation (every flavour of shadow mapping is terrible!) and global illumination is already making its way into games, and path tracing has been the algorithm of choice in offline rendering (my profession since 2010) for quite a while already.

      Making a flexible rasterisation-based renderer is a huge engineering undertaking (see e.g. Unreal Engine). With the relentless march of processing power, and with ray tracing finally enjoying the hardware acceleration that rasterisation has had for decades, it's going to be possible for much smaller teams to deliver realistic and creative (see e.g. Dreams[0]) visuals with far less engineering effort. Some nice recent examples of this are Teardown[1] and Tiny Glade[2].

      It's even more inevitable from today's point of view than it was back in the 90s :)

      [0] Dreams: https://www.youtube.com/watch?v=u9KNtnCZDMI

      [1] Teardown: https://teardowngame.com/

      [2] Tiny Glade: https://www.youtube.com/watch?v=jusWW2pPnA0

  • prox 3 days ago

    Hi teenage you! You did well :)

    The idea of the radiance cores is pretty neato

    • ksec 3 days ago

      > radiance cores is pretty neato

      I still don't understand how it is different from Nvidia's RT cores.

      • jsheard 3 days ago

        AFAICT it's not really different; they're just calling it something else for marketing reasons. The system described in the Sony patent (having a fixed-function unit traverse the BVH asynchronously from the shader cores) is more or less how Nvidia's RT cores have worked from the beginning, as opposed to AMD's early attempts, which accelerated certain intersection tests but still required the shader cores to drive the traversal loop.

  • nightfly 3 days ago

    I wonder if we'll ever get truly round objects in my lifetime though

    • phkahler 3 days ago

      My old ray tracer could do arbitrary quadric surfaces, toroids with 2 minor radii, and CSG of all those. Triangles too (no CSG). It was getting kind of fast 20 years ago - 10fps at 1024x768. Never had good shading though.

      I should dig that up and add NURBS and see how it performs today.
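      (For context: intersecting a ray with any quadric reduces to solving a quadratic in the ray parameter t. The sphere case, as a toy sketch:)

        import math

        # Toy ray/sphere intersection: substitute the ray o + t*d into the
        # implicit surface |p - c|^2 = r^2 and solve the quadratic in t.
        def ray_sphere(o, d, c, r):
            oc = [o[i] - c[i] for i in range(3)]
            a = sum(x * x for x in d)
            b = 2.0 * sum(oc[i] * d[i] for i in range(3))
            k = sum(x * x for x in oc) - r * r
            disc = b * b - 4.0 * a * k
            if disc < 0:
                return None                         # ray misses
            t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest root
            return t if t > 0 else None

        print(ray_sphere((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0))  # 4.0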

    • csmoak 3 days ago

      Dreams on PlayStation and Unbound on PC both use SDFs to allow users to make truly round objects for games.

three_burgers 3 days ago

It feels like each time SCE makes a new console, it comes with some novelty that's supposed to change the field forever, but after two years it always ends up just another console.

  • jpalawaga 3 days ago

    You end up with a weird phenomenon.

    Games written for the PlayStation exclusively get to take advantage of everything, but there is nothing to compare the release to.

    Alternatively, if a game is released cross-platform, there's little incentive to tune the performance past the benchmarks of comparable platforms. Why make the PlayStation version look better than the Xbox one if it involves rewriting engine-layer stuff to take advantage of the hardware, for one platform only?

    Basically all of the most interesting utilization of the hardware comes at the very end of the console's lifecycle. It's been like that for decades.

    • three_burgers 3 days ago

      I think apart from cross-platform woes (if you can call it that), it's also that the technology landscape would shift, two or few years after the console's release:

      For PS2, game consoles didn't become the centre of home computing; for PS3, programming against the GPU became the standard of doing real time graphics, not some exotic processor, plus that home entertaining moved on to take other forms (like watching YouTube on an iPad instead of having a media centre set up around the TV); for PS4, people didn't care if the console does social networking; PS5 has been practical, it's just the technology/approach ended up adopted by everyone, so it lost its novelty later on.

      • ffsm8 3 days ago

        You've got a very "interesting" history there; it's certainly not particularly grounded in reality, however.

        The PS2's edge was generally seen as the DVD player.

        That's why Sony went with Blu-ray in the PS3, hoping to capitalize on the next medium, too. While that bet didn't pay out, Xbox kinda self-destructed, consequently making them the dominant player anyway.

        Finally:

        > PS5 has been practical, it's just the technology/approach ended up adopted by everyone, so it lost its novelty later on.

        The PS5 did not have any novel approach that was consequently adopted by others. The only thing "novel" in the current generation is frame generation, and that was already being pushed for years by the time Sony jumped on that bandwagon.

      • pjmlp 3 days ago

        That is very country-specific: in many countries home computers have dominated ever since the 8-bit days, whereas in others consoles have dominated since the Nintendo/SEGA days.

        • anthk 3 days ago

          Also, tons of blue-collar people bought Chinese NES clones even in the mid-90s (at least in Spain), while some people with white-collar jobs bought their kids a PlayStation. And of course the Brick Game Tetris console was everywhere. By the late 90s, yes, most people could afford a PlayStation, but as for myself, I got a computer in the very early 00s and I could emulate the PSX and most N64 games just fine (my computer wasn't a high-end one, but the emulators were good enough to play the games at 640x480 with a bilinear filter).

    • ViscountPenguin 3 days ago

      I suspect it won't be as much of an issue next gen, with Microsoft basically dropping out of the console market.

      • awill 3 days ago

        3rd party games will still want to launch on the Nintendo Switch 2, so it's still the same problem.

      • ErneX 3 days ago

        They are definitely doing something but it seems it’s going to be more PC-like. Like even supporting 3rd party stores.

        I’m intrigued.

    • beagle3 3 days ago

      It’s also that way on the C64 - while it came out in 1981, people figures out how to get 8 bit sound and high resolution color graphics with multiple sprites only after 2000…

  • ericye16 3 days ago

    Maybe I ate too much marketing, but it does feel like the PS5 shipping with a fast SSD raised the bar for how fast games are expected to load, even across platforms.

    • ThatPlayer 3 days ago

      Not just loading times; I expect more games to do more aggressive dynamic asset streaming too. Hopefully we'll get fewer 'squeeze through this gap in the wall while we hide the loading of the next area of the map' moments in games.

      Technically the PS4 supported 2.5" SATA or USB SSDs, but yeah, the PS5 is the first generation that requires an SSD, and you cannot run PS5 games off USB anymore.

  • noir_lord 3 days ago

    It does, but I don't think that's necessarily a bad thing; they at least are willing to take some calculated risks with architecture, since consoles have essentially collapsed into being a PC internally.

    • three_burgers 3 days ago

      I don't think it's a bad thing either. Consoles are a curious breed in today's consumer electronics landscape, it's great that someone's still devoted to doing interesting experiments with it.

  • numpad0 3 days ago

    That was kind of true until the Xbox 360 and, later, Unity; those ended the era of consoles as machines made of quirks, and of game design as primarily a software architecture problem. The definitive barrier to entry for indie gamedevs before Unity was the ability to write a toy OS, a rich 3D engine, and a GUI toolkit by themselves. Only a little storytelling skill was needed.

    Consoles also partially had to be quirky dragsters because of Moore's Law: they had to be ahead of PCs by years, because they had to be at least comparable to PC games at the end of their lifecycle, not utterly obsolete.

    But we've all moved on. IMO that is a good thing.

  • [removed] 3 days ago
    [deleted]
sergiotapia 3 days ago

Graphics could stand to get toned down. It sucks to wait 7 years for a sequel to your favorite game. There was a time when sequels came out while the games were still relevant. We are getting sequels 8 or more years apart, for what? Better beard graphics? Beer bottles where the liquid reacts when you bump into them? Who cares!

  | Game                                      | Release Year |
  |-------------------------------------------|--------------|
  | GTA III                                   | 2001         |
  | GTA Vice City                             | 2002         |
  | GTA San Andreas                           | 2004         |
  | Sly Cooper and the Thievius Raccoonus     | 2002         |
  | Sly 2: Band of Thieves                    | 2004         |
  | Sly 3: Honor Among Thieves                | 2005         |
  | Infamous                                  | 2009         |
  | Infamous 2                                | 2011         |
We are 5 full years into the PS5's lifetime. These are the only games that are exclusive to the console.

  | Game                                      | Release Year |
  |-------------------------------------------|--------------|
  | Astro's Playroom                          | 2020         |
  | Demon's Souls                             | 2020         |
  | Destruction AllStars                      | 2021         |
  | Gran Turismo 7                            | 2022         |
  | Horizon Call of the Mountain              | 2023         |
  | Firewall Ultra                            | 2023         |
  | Astro Bot                                 | 2024         |
  | Death Stranding 2: On the Beach           | 2025         |
  | Ghost of Yōtei                            | 2025         |
wejick 3 days ago

Funny, I thought the biggest improvement of the PS5 was actually the crazy fast storage. No loading screens is a real game changer. I would love to get Xbox-style instant resume on PlayStation.

Graphics are nice, but not number one.

  • Pulcinella 3 days ago

    The hardware 3D audio acceleration (basically fancy HRTFs) is also really cool, but almost no 3rd party games use it.

    I've had issues with Xbox instant resume. Lots of "your save file has changed since the last time you played, so we have to close the game and relaunch" issues, even when the game was suspended an hour earlier. I assume it's just cloud save sync timing, where the cloud save looks newer because it has a timestamp 2 seconds after the local one. Doesn't fill me with confidence, though.

  • jesse__ 3 days ago

    Pretty sure they licensed a compression codec from RAD and implemented it in hardware, which is why storage is so fast on the PS5. Sounds like they're doing the same thing for GPU transfers now.

    • wtallis 3 days ago

      Storage on the PS5 isn't really fast. It's just not stupidly slow. At the time of release, the raw SSD speeds for the PS5 were comparable to the high-end consumer SSDs of the time, which Sony achieved by using a controller with more channels than usual so that they didn't have to source the latest NAND flash memory (and so that they could ship with only 0.75 TB capacity). The hardware compression support merely compensates for the PS5 having much less CPU power than a typical gaming desktop PC. For its price, the PS5 has better storage performance than you'd expect from a similarly-priced PC, but it's not particularly innovative and even gaming laptops have surpassed it.

      The most important impact by far of the PS5 adopting this storage architecture (and the Xbox Series X doing something similar) is that it gave game developers permission to make games that require SSD performance.
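      For scale, a back-of-envelope check (the raw speed and the typical Kraken compression ratio below are approximate launch-era figures, not measurements):

        # Effective PS5 read throughput with hardware decompression.
        raw_gb_per_s = 5.5      # approximate raw SSD read speed
        kraken_ratio = 1.6      # assumed typical compression ratio
        print(raw_gb_per_s * kraken_ratio)  # ~8.8, the oft-quoted "8-9 GB/s"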

      • jesse__ 3 days ago

        So, you're saying they built a novel storage architecture that competed with state-of-the-art consumer hardware at a lower price point; that five years later, laptops are just catching up; and that at the same price point, it's faster than what you'd expect from a PC.

        The compression codec they licensed was built by some of the best programmers alive [0], and was later acquired by Epic [1]

        I dunno how you put those together and come up with "isn't really fast" or "not particularly innovative".

        Fast doesn't mean 'faster than anything else in existence'. Fast is relative to other existing solutions with similar resource constraints.

        [0] https://fgiesen.wordpress.com/about/ [1] https://www.epicgames.com/site/en-US/news/epic-acquires-rad-...

viktorcode 3 days ago

This video is a direct continuation of the one where Cerny explains the logic behind the PlayStation 5 Pro design, saying that the path forward for them is rendering a near-perfect low-res image, then upscaling it to 4K with neural networks.

How good will it be? Just look at current upscalers working on perfectly rendered images - photos - and they aren't even doing it in realtime. So errors, noise, and artefacts are all but inevitable. Those will be masked by post-processing techniques that will inevitably degrade image clarity.

  • wartywhoa23 3 days ago

    It only takes a marketing psyop to alter the perception of the end user, with slogans along the lines of: "Tired of pixel exactness, hurt by sharpness? Free YOUR imagination and embrace the future of ever-shifting vague forms and softness. Artifact stands for Art!"

    • LtdJorge 3 days ago

      I’m replaying CP2077 for the third time, and all the sarcastic marketing material and ads you find in the game, don’t seem so sarcastic after all when you really think about the present.

      • bigyabai 3 days ago

        If you think those are uncanny, wait until you hear the ads in GTAV.

        • nxobject 3 days ago

          Pepperidge Farm remembers the days of “Pißwasser, this is beer! Drive drunk, off a pier!”

          And, luckily enough, craft beer in the US has only gotten better since then.

  • jayd16 3 days ago

    I don't know, I think it's conceivable that you could get much much better results from a custom upscale per game.

    You can give it much more input than a single low-res frame. You could throw in motion vectors, scene depth, scene normals, unlit color; you could separately upscale opaque, transparent, and post-process effects... I feel like you could really do a lot more.

    Plus, aren't cellphone camera upscalers pretty much realtime these days? I think you're comparing generating an image from scratch to what would actually be happening.
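    As a sketch of that interface (hypothetical names; as the reply below notes, this input set is close to what DLSS-style upscalers already consume):

      # Hypothetical inputs for a per-game neural upscaler; the network
      # sees much more than the low-res color frame. Names are illustrative.
      from dataclasses import dataclass
      from typing import Any

      @dataclass
      class UpscalerInputs:
          color_lowres: Any    # rendered frame at e.g. 1080p
          motion_vectors: Any  # per-pixel motion, for temporal reuse
          depth: Any           # scene depth
          normals: Any         # surface normals
          unlit_color: Any     # albedo before lighting
          history: Any         # previous upscaled output frame

      def upscale(network, inputs: UpscalerInputs):
          # A network trained per game predicts the high-res frame.
          return network(inputs)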

    • wtallis 3 days ago

      > I think it's conceivable that you could get much much better results from a custom upscale per game.

      > You can give it much more input than a single low-res frame. You could throw in motion vectors, scene depth, scene normals, unlit color; you could separately upscale opaque, transparent, and post-process effects... I feel like you could really do a lot more.

      NVIDIA has already been down that road. What you're describing is pretty much DLSS, at various points in its history. To the extent that those techniques were low-hanging fruit for improving upscaler quality, it's already been tried and adopted to the extent that it's practical. At this point, it's more reasonable to assume that there isn't much low-hanging fruit for further quality improvements in upscalers without significant hardware improvements, and that the remaining artifacts and other downsides are hard problems.