• novacomets@lemmy.myserv.one · 4 days ago

    Am I alone in being firmly against upscaling? I’ve never tried it, and everything I play is at native resolution only. There have been games that only used 20% of my GPU, so I doubled the game resolution, but I’ve never used upscaling.

    • Grofit@lemmy.world · 4 days ago

      For some games I’m happy to turn it on, as it drops power/temps and provides virtually identical output to native (as far as I can tell, anyway), while my fans don’t need to go into overdrive mode.

      I may even turn frame gen on too if I just want to bump a stable-ish 80-90fps to a stable 120fps, which again drops power and temps slightly. That sometimes causes smearing, but for the most part I don’t notice it enough to be annoyed. Without them I would probably be running with more power draw and higher temps, and possibly still not hitting 120 even at lower resolutions. Some games, as you say, can hit 120 no problem without either, and the GPU won’t be stressed.

      • MonkderVierte@lemmy.ml · 3 days ago

        > want to bump a stable-ish 80-90fps to a stable 120fps, which again drops power and temps slightly. That sometimes causes smearing, but for the most part I don’t notice it enough to be annoyed.

        So why do you want high fps? The issue with low fps is that it causes smearing.

        I don’t notice smearing much and run my games/display at 30 fps.

        And then there are games with a motion blur effect. Silly, right?

        • Grofit@lemmy.world · 3 days ago

          It’s a personal preference, but I would take occasional smearing over a janky frame rate. I don’t know why, but if you’re not at a solid 60, 90, or 120 it just feels like there is a stutter every second or two, even though it should be fine if it’s above 60fps.

          • MonkderVierte@lemmy.ml · 3 days ago

            That’s the point: LCD/LED panels don’t stutter, they smear. It’s an optical effect, because an LCD doesn’t “refresh” but holds the static image until the next frame, unlike a CRT.
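
            Rough numbers, if it helps: on an ideal full-persistence sample-and-hold panel (a simplifying assumption here, no strobing or black-frame insertion), eye-tracked motion smears across roughly speed times hold time. A toy sketch:

```python
# Toy model of hold-type motion blur on an ideal sample-and-hold display.
# Assumes full persistence (no strobing/black-frame insertion) and that the
# eye tracks the moving object, so each held frame smears across the retina.
def smear_pixels(speed_px_per_s: float, fps: float) -> float:
    """Approximate blur width: motion speed multiplied by frame hold time."""
    hold_time_s = 1.0 / fps
    return speed_px_per_s * hold_time_s

# Example: an object crossing a 1920px-wide screen in one second.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps: ~{smear_pixels(1920, fps):.0f} px of smear")
```

            With those toy numbers that’s roughly 64 px of smear at 30 fps versus 16 px at 120 fps, which is why higher frame rates reduce the smearing even on a hold-type panel.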

        • Grofit@lemmy.world · 3 days ago

          In some games it’s super obvious that upscaling is running and it looks awful, and some games look awful even without upscaling (TAA). It’s odd that in some cases tools like DLSS can look better than native TAA.

          For the most part though it feels like the FLAC vs Opus debate or whatever: most people can’t notice the difference and don’t care enough about it, but some people can tell the difference and want the best. I don’t think either side is wrong; it’s just down to how much you notice/can tolerate before it’s annoying.

    • iAmTheTot@sh.itjust.works · 4 days ago

      I’m sure you’re not alone but can I ask why? Some games use it better than others, and the tech has come a long way. If I have to choose between native 4k and 40fps, or 80 or even more fps with tiny artifacts that I only really notice when I’m actively pixel peeping… I mean, I’ll take the latter, personally.
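
      For a rough sense of where that extra frame rate comes from: the upscaler renders far fewer pixels and reconstructs the rest. A back-of-the-envelope sketch, using the commonly cited per-axis scale factors for DLSS/FSR modes (an assumption here, since the exact values vary by game and setting):

```python
# Back-of-the-envelope shading-cost comparison for a 4K (3840x2160) output.
# The per-axis scale factors below are the commonly cited defaults for
# DLSS/FSR quality modes; treat them as illustrative, not exact.
OUT_W, OUT_H = 3840, 2160
MODES = {"Native": 1.0, "Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

native_px = OUT_W * OUT_H
for name, scale in MODES.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    share = (w * h) / native_px
    print(f"{name:>11}: {w}x{h} ({share:.0%} of the pixels shaded at native)")
```

      Quality mode shades under half the pixels and Performance about a quarter, which is roughly where a jump like 40 to 80+ fps comes from in heavily GPU-bound scenes (scaling isn’t perfectly linear, and the upscale pass itself has some cost).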

      • Fenrisulfir@lemmy.ca · 3 days ago

        But with those options, your latency is still equivalent to 40fps, so it won’t feel any snappier, and now you’re introducing artifacts, so the image quality is worse. I get upscaling. I don’t get why anyone would ever enable frame gen.
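
        To put rough numbers on the latency point (a simplified model, assuming interpolation-style frame gen, where generated frames sit between two already-rendered frames and therefore can’t react to new input):

```python
# Simplified sketch of why interpolated frames don't improve responsiveness.
# Assumption: only real (rendered) frames sample player input; generated
# frames are interpolated between two real frames, so they can't react to it.
def frame_time_ms(fps: float) -> float:
    """Time between frames at a given frame rate, in milliseconds."""
    return 1000.0 / fps

real_fps, output_fps = 40, 80   # e.g. 40 fps rendered, 80 fps displayed

print(f"Displayed every {frame_time_ms(output_fps):.1f} ms -> motion looks like {output_fps} fps")
print(f"Input sampled every {frame_time_ms(real_fps):.1f} ms -> responsiveness still tracks {real_fps} fps")
```

        (In practice there’s also a small extra delay, because the interpolator has to wait for the next real frame before it can show the in-between one.)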

        • iAmTheTot@sh.itjust.works · 3 days ago

          Neither I nor the person I was replying to was talking about frame gen.

          As for the latency, not in a way that I notice. I’d rather play 80 or more fps upscaled with minute artifacts that I don’t notice, than 40 fps native 4k, just as I said. It feels much better to me, even if it’s placebo.

      • novacomets@lemmy.myserv.one · 3 days ago

        I view upscaling as giving developers a cop-out for not having to optimize games, as well as an admission that nobody is capable of engineering gaming GPUs that can do 4K ultra path tracing at 165fps.

        I’m also getting suspicious that AAA+ games are dropping original writing and story development, replacing them with flashy graphics, and then selling the game on how good it looks. Indie studios don’t have the budget for upscaling, and no 6-year-old game from anybody has upscaling, so I find it to be more of a gimmick than a solution. Nobody plays a game 4 years after release because of how good its graphics look.