I saw a meme about something called “fake frames,” but I don’t know what it’s referring to.

  • givesomefucks@lemmy.world · 6 months ago

    “Fake frames” refers to “frame generation”; on Nvidia it’s part of DLSS (DLSS Frame Generation).

    Rather than having the graphics card render 120 frames, you can crank the settings up to the point where it only renders 60, and the AI then “guesses” what the in-between frame would show, doubling it to 120 while keeping the higher settings.

    This can make things blurry because the AI may guess wrong. Every odd frame is real, and every even frame is just a guess.

    Frame 1: real

    Frame 2: guess

    Frame 3: real

    If the guess for #2 is accurate, everything is cool. If #2 guesses that a target moved left when it actually moved right, then #3 corrects it, and that “blink” is the problem (a rough sketch follows below).
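
    To make the real/guess pattern concrete, here’s a very rough Python sketch of frame interpolation. This is not Nvidia’s actual algorithm (DLSS Frame Generation uses motion vectors and a hardware optical-flow unit, not a plain blend); it only illustrates the idea of estimating the even frames from the neighbouring real ones:

    ```python
    import numpy as np

    def generate_fake_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
        # Naive stand-in for frame generation: average the two surrounding real frames.
        # The real thing estimates motion instead of blending, but the principle is the
        # same: the in-between frame is guessed from finished frames, not rendered.
        return ((prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2).astype(np.uint8)

    def interleave(real_frames: list[np.ndarray]) -> list[np.ndarray]:
        # Turn 60 rendered frames into ~120 displayed frames: real, guess, real, guess...
        output = []
        for prev_frame, next_frame in zip(real_frames, real_frames[1:]):
            output.append(prev_frame)                                   # odd frame: real
            output.append(generate_fake_frame(prev_frame, next_frame))  # even frame: guess
        output.append(real_frames[-1])                                  # last real frame
        return output

    # Tiny example with two 2x2 RGB "frames"
    frames = [np.zeros((2, 2, 3), dtype=np.uint8), np.full((2, 2, 3), 200, dtype=np.uint8)]
    print(len(interleave(frames)))  # 3 frames out: real, guess, real
    ```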

    The bigger issue is developers relying on that tech so they don’t have to optimize their code. So rather than DLSS being extra oomph, it’s going to be required for “acceptable” performance.

    • stankmut@lemmy.world · 6 months ago

      To add on to this, the 5000 series now generates 3 fake frames per real frame instead of just 1.
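
      Same idea as the sketch above, just with several guessed frames inserted between each pair of real frames, so 30 rendered fps becomes roughly 120 displayed fps. The 1/4, 1/2, 3/4 blend weights below are purely illustrative, not how the actual multi-frame generation works:

      ```python
      import numpy as np

      def interleave_multi(real_frames: list[np.ndarray], guesses_per_real: int = 3) -> list[np.ndarray]:
          # With 3 guesses per real frame (as described for the 5000 series),
          # every rendered frame yields 4 displayed frames.
          output = []
          for prev_frame, next_frame in zip(real_frames, real_frames[1:]):
              output.append(prev_frame)  # the real frame
              for i in range(1, guesses_per_real + 1):
                  t = i / (guesses_per_real + 1)  # 0.25, 0.5, 0.75 (illustrative weights only)
                  guess = (1 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
                  output.append(guess.astype(prev_frame.dtype))
          output.append(real_frames[-1])
          return output
      ```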

          • NewNewAccount@lemmy.world · 6 months ago

            Yeah not sure if there’s a better word to use without coming across as pedantic.

            Fake certainly implies these are worse (which they of course are), but I’m not sure if they’re that much worse. I think in many scenarios the proverbial juice would absolutely be worth the squeeze, but naysayers seem to disagree with that sentiment.

    • ch00f@lemmy.world · 6 months ago

      Can someone explain how AI can generate a frame faster than the conventional method?

      • MrPoopbutt@lemmy.world · 6 months ago

        It is image processing with statistics rather than traditional rendering. It is a completely separate process. Nvidia GPUs (and the upcoming AMD ones too) also have hardware built into the chip specifically for this.
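
        A hand-wavy way to see why a generated frame can be cheaper than a rendered one: rendering cost depends on the scene (geometry, shaders, ray tracing), while frame generation is roughly a fixed amount of work per pixel over two frames that already exist, largely offloaded to dedicated units. The numbers below are a toy cost model, not benchmarks:

        ```python
        # Toy cost model: the point is only that render cost scales with scene
        # complexity while generation cost is roughly fixed per pixel.

        PIXELS_4K = 3840 * 2160

        def render_cost(pixels: int, work_per_pixel: float) -> float:
            # Full render: shading/ray-tracing work per pixel grows with scene complexity.
            return pixels * work_per_pixel

        def generation_cost(pixels: int, work_per_pixel: float = 2.0) -> float:
            # Frame generation: a fixed-cost image-space pass over two finished frames.
            return pixels * work_per_pixel

        print(render_cost(PIXELS_4K, work_per_pixel=40.0))  # heavy scene: expensive
        print(generation_cost(PIXELS_4K))                   # same cost regardless of scene
        ```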

        • ch00f@lemmy.world · 6 months ago

          Which part? I mean even if it isn’t generating the frames well, it’s still doing the work. So that capability is there. What’s the grift?

          • RubberElectrons@lemmy.world · 6 months ago

            That it’s reliable. The key point they’re selling is that devs don’t need to optimize their engines as much, which is of course obfuscated under a lot of other value-adds.

            I’d go further and say part of the problem is that code optimization generally isn’t a focus anymore. Apps that merely interface with web APIs are sometimes more than 90 MB. That’s embarrassing.

            The idea that an AI can step in as a savior for poor coding practices is really a bandage stuck on the root cause.