cyanydeez 3 days ago

I'm still watching 720p movies and video games.

Somewhere between 60Hz and 240Hz, there's zero fundamental benefit. Same for resolution.

It isn't just that hardware progress is a sigmoid; so is our experiential value.

The reality is that exponential improvement is not a fundamental force. It's always going to find some limit.

majkinetor 3 days ago

On my projector (120-inch) the difference between 720p and 4k is night and day.

  • crote 3 days ago

    Screen size is pretty much irrelevant, as nobody is going to be watching it at nose-length distance to count the pixels. What matters is angular resolution: how much area does a pixel take up in your field of vision? Bigger screens are going to be further away, so they need the same resolution to provide the same quality as a smaller screen which is closer to the viewer.
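
    A back-of-the-envelope sketch of the angular-resolution point (Python; the viewing distances and the ~60 pixels-per-degree "retina" rule of thumb are my assumptions, not the parent's numbers):

        import math

        def pixels_per_degree(diagonal_in, res_w, distance_in, aspect=16/9):
            """Approximate horizontal pixels per degree of visual angle."""
            # Screen width from the diagonal and the aspect ratio.
            width_in = diagonal_in * aspect / math.hypot(aspect, 1)
            # Visual angle subtended by the full screen width.
            angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
            return res_w / angle_deg

        # 32" 4k monitor at a 24" desk distance vs. a 120" 720p projector
        # viewed from 10 feet; ~60 ppd is a common 20/20-vision threshold.
        print(pixels_per_degree(32, 3840, 24))    # ~64 ppd
        print(pixels_per_degree(120, 1280, 120))  # ~27 ppd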

    Resolution-wise, it depends a lot on the kind of content you are viewing as well. If you're looking at a locally-rendered UI filled with sharp lines, 720p is going to look horrible compared to 4k. But when it comes to video you've got to take bitrate into account as well. If anything, a 4k movie with a bitrate of 3Mbps is going to look worse than a 720p movie with a bitrate of 3Mbps.
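
    To put numbers on the bitrate point (the 24fps frame rate is my assumption):

        # Bits available per pixel per frame at a fixed 3 Mbps stream.
        bitrate = 3_000_000  # bits per second
        fps = 24

        for name, w, h in [("720p", 1280, 720), ("4k", 3840, 2160)]:
            print(f"{name}: {bitrate / (w * h * fps):.3f} bits per pixel per frame")

        # 720p: ~0.136 bpp; 4k: ~0.015 bpp. The encoder gets ~9x less data
        # per pixel at 4k, so compression artifacts dominate the extra detail.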

    I definitely prefer 4k over 720p as well, and there's a reason my desktop setup has had a 32" 4k monitor for ages. But beyond that? I might be convinced to spend a few bucks extra for 6k or 8k if my current setup dies, but anything more would be a complete waste of money - at reasonable viewing distances there's absolutely zero visual difference.

    We're not going to see 10,000Hz 32k graphics in the future, simply because nobody will want to pay extra to upgrade from 7,500Hz 16k graphics. Even the "hardcore gamers" don't hate money that much.

  • Vvector 3 days ago

    Does an increased pixel count make a bad movie better?

    • Mawr 3 days ago

      Does a decreased pixel count make a good movie better?

Mawr 3 days ago

Lower latency between your input and its results appearing on the screen is exactly what a fundamental benefit is.

The resolution part is even sillier - you literally get more information per frame at higher resolutions.

Yes, the law of diminishing returns still applies, but 720p@60Hz is way below the optimum. I'd estimate 4k@120Hz as the low end of optimal, maybe? There's some variance w.r.t. the application; a first-person game is going to have different requirements from a movie, but either way 720p ain't it.
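
To put the latency half in numbers - frame time is only one term in the input-to-photon chain, so this is a sketch of a lower bound, not total latency:

    # Display frame time alone, one term in total input latency.
    for hz in (60, 120, 240):
        print(f"{hz}Hz: {1000 / hz:.2f} ms per frame")

    # 60Hz: 16.67 ms; 120Hz: 8.33 ms; 240Hz: 4.17 ms.
    # 60 -> 120 shaves ~8.3 ms, 120 -> 240 only ~4.2 ms more:
    # a real gain, but with the diminishing returns noted above.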

IlikeKitties 3 days ago

> I'm still watching 720p movies and video games.

There's a noticeable and obvious improvement from 720p to 1080p to 4k (depending on the screen size). While there are diminishing gains, up to at least 1440p there's still a very noticeable difference.

> Somewhere between 60Hz and 240Hz, there's zero fundamental benefit. Same for resolution.

Also not true. While the difference between 40fps and 60fps is more noticeable than, say, from 60 to 100fps, the difference is still noticeable enough. Add to that the reduction in latency, which is also very noticeable.

  • saulpw 3 days ago

    Is the difference between 100fps and 240fps noticeable though? The OP said "somewhere between 60Hz and 240Hz" and I agree.

    • unethical_ban 3 days ago

      Somewhere between a shoulder tap and a .30-06 there is a painful sensation.

      The difference between 60Hz and 120Hz is huge to me. I haven't had a lot of experience above 140Hz.

      Likewise, 4k makes a huge difference in font rendering, and 1080p->1440p is big in gaming.

      • drawfloat 3 days ago

        4K is big, but certainly not as big a leap forward as SD to HD.

    • IlikeKitties 3 days ago

      That would be a very obvious and immediately noticeable difference, but you need enough FPS rendered (natively, not with latency-increasing frame generation) and a display that can actually do 240Hz without becoming a smeary mess.

      If you have that combination and play with it for an hour, you'll never want to go back to a game locked at 100Hz. It's rather annoying in that regard, actually.

      • oivey 3 days ago

        Even with frame generation it is incredibly obvious. The latency is certainly a downside, but 100 FPS vs 240 FPS is extremely evident to the human visual system.

    • theshackleford 2 days ago

      > Is the difference between 100fps and 240fps noticeable though?

      Yes.

      > The OP said "somewhere between 60Hz and 240Hz" and I agree.

      Plenty of us don't. A 240Hz OLED still produces a significantly blurrier image in motion than my 20+ year old CRT.

      • drougge 2 days ago

        Surely that 20+ year old CRT didn't run at more than 240Hz? Something other than framerate is at play here.

        • theshackleford 2 days ago

          > Surely that 20+ year old CRT didn't run at more than 240Hz?

          It didn't have to.

          > Something other than framerate is at play here.

          Yes: sample-and-hold motion blur, which is inherent to pretty much all modern display types in common use.

          Even at 240Hz, modern displays cannot match a CRT for motion quality.

          https://blurbusters.com/faq/oled-motion-blur/
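
          The linked FAQ's rule of thumb is roughly: perceived blur ≈ eye-tracking speed × pixel persistence. A quick sketch (the pan speed and the ~1.5ms CRT phosphor decay are ballpark example figures):

              # Eye-tracked motion blur ~= pan speed * pixel persistence.
              speed = 960  # pixels per second, an example pan speed

              def blur_px(persistence_ms):
                  return speed * persistence_ms / 1000

              # Sample-and-hold: each frame stays lit for the whole refresh interval.
              print(blur_px(1000 / 240))  # 240Hz OLED: ~4 px of smear
              # CRT: the phosphor flashes and decays within ~1-2 ms.
              print(blur_px(1.5))         # CRT: ~1.4 px of smear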