Say hello to the super-speedy 27-inch UltraGear GX7.

  • SpaceNoodle@lemmy.world · 6 days ago

    At that point, was it really the rendering speed, or just the finer game engine granularity that made a difference?

    • FooBarrington@lemmy.world · 5 days ago

      What is “game engine granularity” supposed to mean?

      Unless they fucked up the test, the only difference is how fast pictures arrive on the screen. If the test showed that pro players were able to tell a difference, it’s reasonable to assume that this is actually the case, unless you can show a flaw in their test setup.

      • Artyom@lemm.ee · 5 days ago

        A frame is not just the picture arriving at the screen; it's everything from input processing to game logic to rendering to the picture finally being displayed. What the other commenter was saying is that things like input lag and game-logic smoothness should affect player performance as well. In fact, you can isolate those variables with an unlocked frame rate: you can get a frame rate in the 250s on a 144Hz monitor, and pros still see an improvement in that case, because those hidden subcycles smooth out the non-visual calculations.
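
        To make those "hidden subcycles" concrete, here is a minimal sketch of a classic single-threaded game loop in Python. poll_input, update_game and render are hypothetical stand-ins, not anything from a real engine; the point is only that input sampling and logic run once per frame, so at ~250 fps those steps happen roughly every 4 ms even when a 144Hz panel only shows a new image every ~7 ms.

        ```python
        import time

        def poll_input():
            """Hypothetical stand-in: read the latest mouse/keyboard state."""
            return {}

        def update_game(state, inputs, dt):
            """Hypothetical stand-in: advance the simulation by dt seconds."""
            pass

        def render(state):
            """Hypothetical stand-in: submit a frame; the monitor only shows
            the most recently completed frame on each refresh."""
            pass

        def run_unlocked(state, frames=1000):
            # Single-threaded loop: input, logic and rendering run back to back.
            # At ~250 fps each pass takes ~4 ms, so inputs are sampled and acted
            # on every ~4 ms even if a 144Hz display shows only every ~7 ms.
            prev = time.perf_counter()
            for _ in range(frames):
                now = time.perf_counter()
                dt, prev = now - prev, now
                update_game(state, poll_input(), dt)
                render(state)
        ```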

        • FooBarrington@lemmy.world · 5 days ago

          Sure, but why would the input lag differ between the monitors that were tested? If it did, we should be able to point to those differences in the test setup, instead of saying “yeah, they were probably just too dumb”.

          The game logic also shouldn't be different, as CS game logic hasn't been tied to frame rate for a long time (if ever): the simulation advances on a fixed tick rate regardless of how many frames get rendered.
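
          For what it's worth, a fixed-timestep loop like the sketch below is the usual way engines decouple logic from frame rate. It's Python with the same hypothetical poll_input / update_game / render stubs, and the 64Hz tick rate is just an example value, not a claim about any specific CS version: the simulation always advances in constant ticks, so extra frames only buy fresher input sampling and lower display latency, not different game logic.

          ```python
          import time

          TICK_RATE = 64            # example tick rate, chosen for illustration
          TICK_DT = 1.0 / TICK_RATE

          def poll_input():
              """Hypothetical stand-in: read the latest mouse/keyboard state."""
              return {}

          def update_game(state, inputs, dt):
              """Hypothetical stand-in: advance the simulation by a fixed dt."""
              pass

          def render(state):
              """Hypothetical stand-in: draw the latest simulated state."""
              pass

          def run_fixed_timestep(state, frames=1000):
              # The accumulator decouples simulation from rendering: logic always
              # advances in constant TICK_DT steps no matter how fast frames come,
              # so a higher frame rate changes input/display latency, not the logic.
              accumulator = 0.0
              prev = time.perf_counter()
              for _ in range(frames):
                  now = time.perf_counter()
                  accumulator += now - prev
                  prev = now

                  inputs = poll_input()                    # sampled once per frame
                  while accumulator >= TICK_DT:
                      update_game(state, inputs, TICK_DT)  # fixed-rate ticks
                      accumulator -= TICK_DT
                  render(state)                            # as fast as frames allow
          ```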