Comment by rafaelmn 3 days ago

I disagree - current-gen consoles aren't enough to deliver smooth, immersive graphics. I played BG3 on PS first and then on PC, and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to consistent 120 fps/4K and better graphics, and I'll buy the games.

And there are AAA studios that make, and will keep making, good money with graphics front and center.

Ntrails 3 days ago

> aren't enough to deliver smooth, immersive graphics

I'm just not sold.

Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game? Not to me, certainly. Was Cyberpunk prettier than Witcher 3? Did it need to be for me to play it?

My query isn't about whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play on the old console with worse graphics.

I also don't think anyone is going to suddenly start playing video games because the graphics improve further.

  • rafaelmn 3 days ago

    > Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game?

    Absolutely - graphical improvements make the game more immersive for me, and I don't want to go back and replay the games I spent hundreds of hours on in the mid-2000s, like, say, NWN or Icewind Dale (never played BG 2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast, because it just isn't as captivating as it was back when it was peak graphics.

    • adlpz 3 days ago

      Well this goes to show that, as some other commenter said, the gamer community (whatever that is) is indeed very fragmented.

      I routinely re-play games like Diablo 2 or BG1/2 and I couldn't care less about graphics, voice acting or motion capture.

    • BolexNOLA 3 days ago

      > Absolutely - graphical improvements make the game more immersive for me

      Exactly. Graphics are not the be-all and end-all for assessing games, but it’s odd how quickly people handwave away graphics in a visual medium.

      • badsectoracula 2 days ago

        > it’s odd how quickly people handwave away graphics in a visual medium.

        There is a difference between graphics as in rendering (i.e. the technical side, how something gets rendered) and graphics as in aesthetics (i.e. visual styles, presentation, etc).

        The latter is important for games because it can be used to evoke some feeling in the player (e.g. cartoony Mario games or dreadful Silent Hill games). The former, however, is not important by itself; its importance only comes as a means to achieve the latter. When people handwave away graphics in games, they handwave away the misplaced focus on graphics-as-in-tech, not on graphics-as-in-aesthetics.

      • kbolino 3 days ago

        Maximal "realism" is neither the only nor even necessarily the best use of that medium.

    • badpun 3 days ago

      For me, the better the graphics, mocap etc., the stronger the uncanny-valley feeling - i.e. I stop perceiving it as a video game and instead see it as an incredibly bad movie.

    • theshackleford 2 days ago

      > I don't want to go back and replay the games I spent hundreds of hours on in the mid-2000s, like, say, NWN or Icewind Dale (never played BG 2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast, because it just isn't as captivating as it was back when it was peak graphics.

      And yet many more have no such issue doing exactly this. Despite having a machine capable of the best graphics at the best resolution, I have exactly zero issues going back and playing older games.

      Just in the past month alone with some time off for surgery I played and completed Quake, Heretic and Blood. All easily as good, fun and as compelling as modern titles, if not in some ways better.

  • keyringlight 3 days ago

    Two aspects I keep thinking about:

    - How difficult it must be for the art/technical teams at game studios to figure out, of all the detail they are capable of putting on screen, how much will actually be appreciated by gamers. Essentially making sure that anything they budget a significant amount of worker time to create isn't something gamers run right past and ignore, and that it contributes meaningfully to 'more than the sum of its parts'.

    - As much as technology is an enabler for art, alongside the install-base issue there's the question of how well pursuing new methods fits the way their studio is used to working, and whether the payoff is there if they spend time adapting. A lot of the gaming business is about shipping product, and a studio's concern is primarily getting content to gamers rather than chasing tech, because that is what lets their business continue; selling GPUs/consoles is another company's business.

pjmlp 3 days ago

Being an old dog that still cares about gaming, I would assert that many games are also not taking advantage of current-gen hardware: coded in Unreal and Unity, a kind of Electron for games when it comes to making use of the hardware that's there.

There is a reason there are so many complaints on social media about it being obvious to gamers which game engine a game was written in.

It used to be that game development quality was taken more seriously, when they were sold via storage media, and there was a deadline to burn those discs/cartridges.

Now they just ship whatever is done by the deadline, and updates will come later via a DLC, if at all.

  • jayd16 3 days ago

    They're both great engines. They're popular and gamers will lash out at any popular target.

    If it were so simple to bootstrap an engine, no one would pay the percentage points to Unity and Epic.

    The reality is the quality bar is insanely high.

    • gyomu 3 days ago

      It is pretty simple to bootstrap an engine. What isn’t simple is supporting asset production pipelines on which dozens/hundreds of people can work simultaneously, and on which new hires/contractors can start contributing right away - which is what modern game businesses require and what Unity/Unreal provide.

  • formerly_proven 3 days ago

    Unreal and Unity would be less problematic if these engines were engineered to match the underlying reality of graphics APIs/drivers, but they're not. Neither of these can systematically fix the shader stuttering they are causing architecturally, and so essentially all games built on these platforms are sentenced to always stutter, regardless of hardware.

    Both of these seem to suffer from incentive issues similar to enterprise software: They're not marketing and selling to either end users or professionals, but studio executives. So it's important to have - preferably a steady stream of - flashy headline features (e.g. nanite, lumen) instead of a product that actually works on the most basic level (consistently render frames). It doesn't really matter to Epic Games that UE4/5 RT is largely unplayable; even for game publishers, if you can pull nice-looking screenshots out of the engine or do good-looking 24p offline renders (and slap "in-game graphics" on them), that's good enough.

    • Uvix 3 days ago

      The shader stutter issues are non-existent on console, which is where most of their sales are. PC, as it has been for almost two decades, is an afterthought rather than a primary focus.

      • MountainTheme12 3 days ago

        No, that's not the reason.

        The shader stutter issues are non-existent on console because consoles have one architecture and you can ship shaders as compiled machine code. For PC you don't know what architecture you will be targeting, so you ship some form of bytecode that needs to be compiled on the target machine.
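
        As a rough sketch of what that means in practice (illustrative only, not from this thread - it assumes a Vulkan renderer, and the file path and helper names are made up):

            #include <vulkan/vulkan.h>
            #include <cstdint>
            #include <fstream>
            #include <vector>

            // PC builds ship SPIR-V bytecode; console builds can ship the GPU's
            // machine code directly, so the driver-side compile never happens there.
            std::vector<uint32_t> loadSpirv(const char* path) {
                std::ifstream f(path, std::ios::binary | std::ios::ate);
                std::vector<uint32_t> words(static_cast<size_t>(f.tellg()) / sizeof(uint32_t));
                f.seekg(0);
                f.read(reinterpret_cast<char*>(words.data()),
                       static_cast<std::streamsize>(words.size() * sizeof(uint32_t)));
                return words;
            }

            // Wrapping the bytecode in a shader module is cheap; the expensive part
            // happens later, when vkCreateGraphicsPipelines compiles it for whatever
            // GPU/driver combination the player happens to have.
            VkShaderModule makeModule(VkDevice device, const std::vector<uint32_t>& spirv) {
                VkShaderModuleCreateInfo info{};
                info.sType = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
                info.codeSize = spirv.size() * sizeof(uint32_t);
                info.pCode = spirv.data();
                VkShaderModule module = VK_NULL_HANDLE;
                vkCreateShaderModule(device, &info, nullptr, &module);
                return module;
            }

            // A pipeline cache persisted to disk turns "compile on every run" into
            // "compile on first run" - it can't remove the very first compile, which
            // is where the stutter comes from when it happens mid-game instead of at load.
            VkPipelineCache loadPipelineCache(VkDevice device, const std::vector<char>& prevRun) {
                VkPipelineCacheCreateInfo info{};
                info.sType = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO;
                info.initialDataSize = prevRun.size();
                info.pInitialData = prevRun.empty() ? nullptr : prevRun.data();
                VkPipelineCache cache = VK_NULL_HANDLE;
                vkCreatePipelineCache(device, &info, nullptr, &cache);
                return cache;
            }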

      • keyringlight 3 days ago

        If anything, I think the PC has been a prototyping or proving ground for technologies on the roadmap for consoles to adopt. It allows software and hardware iteration before a technology is relied upon in a platform that has to stay stable and mostly unchanging for around a decade, from designing the platform through developers using it and, more recently, major refreshes. For example, from around 2009 there were a few cross-platform games with a 32-bit/DX9 baseline but optional 64-bit/DX11 capabilities, and given the costs and teams involved in making the kind of games that stretch those capabilities, I find it hard to believe it would have been just one engineer or a small group putting significant time into optional modes that aren't critical to the game functioning and supporting them publicly. Then a few years later that became the basis of the next generation of consoles.

      • jayd16 3 days ago

        You know the hardware on console, so you can ship precompiled shaders.

        You can't do that on PC, so you either get long first runs or stutter from JIT shader compiles.
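
        A minimal sketch of that trade-off (the types and helpers below are hypothetical, not from any real engine):

            #include <string>
            #include <unordered_map>
            #include <vector>

            struct Pipeline {};                        // stand-in for a compiled GPU pipeline
            struct PipelineDesc { std::string key; };  // stand-in for a shader/state variant

            // Stand-in for the slow, driver-side compile (vkCreateGraphicsPipelines etc.)
            // that consoles avoid by shipping machine code in the first place.
            Pipeline compilePipeline(const PipelineDesc&) { return {}; }

            std::unordered_map<std::string, Pipeline> g_pipelines;

            // Option A: compile every known shader variant during a loading screen.
            // The first run is long, but gameplay never waits on a compile.
            void prewarmPipelines(const std::vector<PipelineDesc>& variants) {
                for (const auto& d : variants)
                    g_pipelines.emplace(d.key, compilePipeline(d));
            }

            // Option B: compile on first use. Startup is instant, but the first frame
            // that needs a new variant blocks on the compile - the classic PC stutter.
            Pipeline& getPipeline(const PipelineDesc& d) {
                auto it = g_pipelines.find(d.key);
                if (it == g_pipelines.end())
                    it = g_pipelines.emplace(d.key, compilePipeline(d)).first;  // hitch here
                return it->second;
            }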

    • jayd16 3 days ago

      Imagine living in a reality where the studio exec picks the engine based on getting screenshots 3 years later when there's something interesting to show.

      I mean, are you actually talking from experience at all here?

      It's really more that engines are an insane expense in money and time, and buying one gets your full team in-engine far sooner. That's why they're popular.

flohofwoe 3 days ago

Just get a PC then? ;) In the end, game consoles haven't been much more than "boring" subsidized low-end PCs for quite a while now.

  • rafaelmn 3 days ago

    A PC costs a lot and depreciates fast; by the end of a console lifecycle I can still count on developers targeting it, while PC performance for 6+ year old hardware is guaranteed to suck. And I'm not a heavy gamer - I'll spend ~100h on games per year, but so will my wife and my son, and a PC sucks for multiple people sharing it - the PS is amazing for that. I know I could concoct some remote-play setup over LAN on the TV to let my wife and kids play, but I just want something I spend a few hundred EUR on, plug into the TV, and it works.

    Honestly the only reason I caved with the GPU purchase (which cost the equivalent of a PS pro) was the local AI - but in retrospect that was useless as well.

    • theshackleford 2 days ago

      > by the end of a console lifecycle I can still count on developers targeting it

      And I can count on those games still being playable on my six year old hardware because they are in fact developed for 6 year old hardware.

      > PC performance for 6+ year old hardware is guaranteed to suck

      For new titles at maximum graphics level sure. For new titles at the kind of fidelity six year old consoles are putting out? Nah. You just drop your settings from "ULTIMATE MAXIMUM HYPER FOR NEWEST GPUS ONLY" to "the same low to medium at best settings the consoles are running" and off you go.

    • randomNumber7 3 days ago

      Oh yeah it's great to play PS4 games while the thing runs with the noise of a vacuum cleaner.

gnulinux996 2 days ago

> current-gen consoles aren't enough to deliver smooth, immersive graphics

The Last of Us franchise, especially Part 2, has given me the most immersive experiences I have had in gaming.

That game pretty much told me that the PlayStation is more than capable of delivering this kind of experience.

Now, if some of those high-budget, so-called AAA games cannot deliver even a fraction of that, then that - I believe - is on them.

wiseowise 3 days ago

> current-gen consoles aren't enough to deliver smooth, immersive graphics

They have been enough to deliver smooth, immersive graphics since the PS4 era.