• giloronfoo@beehaw.org

    Competitive (professional) gamers?

    Seems there are diminishing returns, but at least some gains are measurable at 360.

    • Fushuan [he/him]@lemm.ee

      I thought that 60 Hz was enough for most games, and that 120 or 144 Hz was better for shooters and other real-time games. However, it reaches a point where the human eye can’t notice the difference even if it tries.

      Honestly, pushing the framerate too high is just a waste of GPU power and electricity.

      • narc0tic_bird@lemm.ee

        A better way to look at this is frametime.

        At 60 FPS/Hz, a single frame is displayed for 16.67 ms. At 120 Hz, a single frame is displayed for 8.33 ms. At 240 Hz, a single frame is displayed for 4.17 ms. A difference of over 8 ms per frame (60 vs. 120) is quite noticeable for many people, and the ~4 ms difference (120 vs. 240) is as well, but the impact is only half as large. So you get diminishing returns pretty quickly.

        Now I’m not sure how noticeable 1000 Hz would be to pretty much anyone as I haven’t seen a 1000 Hz display in action yet, but you can definitely make a case for 240 Hz and beyond.
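A quick sketch of the frametime math above (my own illustration, not from the comment), showing how each doubling of refresh rate halves the per-frame gain:

```python
def frametime_ms(hz: float) -> float:
    """Time a single frame stays on screen, in milliseconds."""
    return 1000.0 / hz

# Gain in ms per frame for each doubling of refresh rate.
for prev, cur in [(60, 120), (120, 240), (240, 480)]:
    gain = frametime_ms(prev) - frametime_ms(cur)
    print(f"{prev:>3} -> {cur:>3} Hz: saves {gain:.2f} ms per frame")
```

Each step saves roughly half as many milliseconds as the one before it, which is the diminishing-returns curve described above.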

      • jsomae@lemmy.ml

        It’s pretty easy to discern refresh rate with the human eye if one tries. Just move your cursor back and forth really quickly. The spacing between the ghost cursors in the trail it leaves behind (which, by the way, exist only in the eye’s perception) is inversely proportional to the refresh rate.
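To put rough numbers on the ghost-cursor test (a hypothetical illustration with an assumed cursor speed, not from the comment): the gap between ghost images is the distance the cursor travels in one frame, so it shrinks as the refresh rate rises.

```python
def ghost_gap_px(cursor_speed_px_s: float, hz: float) -> float:
    """Distance the cursor moves between consecutive frames, in pixels.

    Assumed: cursor moving at a constant speed in px/s.
    """
    return cursor_speed_px_s / hz

# Hypothetical 2000 px/s cursor swipe at various refresh rates.
for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz: {ghost_gap_px(2000, hz):.1f} px between ghost cursors")
```

At 60 Hz the ghosts sit ~33 px apart and are easy to count; at 360 Hz they are ~5.6 px apart and start to blur into a continuous streak.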

        • Fushuan [he/him]@lemm.ee

          Sure, but spending double or triple the resources on that isn’t worth it. There are very few games where it’s even a gain, because outside those super-competitive titles it hardly matters.

          • jsomae@lemmy.ml

            Yeah, I agree with you; I was just refuting your claim that it’s not perceivable even if you try.

            • Fushuan [he/him]@lemm.ee

              Oh yeah, I’ve read and heard plenty of people say that they definitely notice it. I’m lucky enough not to, because most ARPGs don’t even run at 60 FPS during intense combat, let alone 120 FPS, on an RTX 3080 lmao.

              I was talking more about the jump from 240 and beyond; I find it surprising that people notice the upgrade during intense gaming encounters rather than while calmly checking or testing. I guess there are people who do notice, but again, running games at such high framerates is very expensive for the GPU and a waste most of the time.

              I’m just kinda butthurt that people act like screens below 120 Hz are bad, when most games I play hardly run at a smooth 60 FPS. The market will follow, and in a few years we’ll hardly have what I consider normal monitors, and the cards will just eat way more electricity for very small gains.