I saw a meme about something called "fake frames," but I don't know what it's about.
"Fake frames" is frame generation; Nvidia's version is part of DLSS. Rather than having the graphics card render 120 frames, you can crank the settings up to where it only manages 60, then AI "guesses" what the next frame would show, doubling it to 120 while keeping the higher settings.
This can make things blurry because the AI may guess wrong. So every odd frame is real, every even frame is just a guess.
Frame 1: real
Frame 2: guess
Frame 3: real
If the guess for #2 is accurate, everything is fine. If #2 guessed that a target moved left when it actually moved right, then #3 corrects it, and that "blink" is the problem.
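If it helps, here's a toy Python sketch of where the generated frames slot in. Plain pixel averaging stands in for the "guess" here; the real thing uses motion vectors and a neural network, so don't read this as how DLSS actually works:

```python
import numpy as np

def guess_frame(prev_frame, next_frame):
    # Toy "guess": just blend the two neighbouring real frames.
    # DLSS uses motion vectors plus a neural network; this only shows
    # where the generated frame slots into the sequence.
    blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2
    return blended.astype(np.uint8)

def add_generated_frames(real_frames):
    # Slot one generated frame between every pair of real frames,
    # so ~60 rendered frames come out as ~120 displayed frames.
    output = []
    for prev, nxt in zip(real_frames, real_frames[1:]):
        output.append(prev)                    # real
        output.append(guess_frame(prev, nxt))  # guess
    output.append(real_frames[-1])             # final real frame
    return output

# 60 stand-in "rendered" frames of RGB noise -> 119 displayed frames
real = [np.random.randint(0, 256, (108, 192, 3), dtype=np.uint8) for _ in range(60)]
print(len(add_generated_frames(real)))  # 119
```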
The bigger issue is developers relying on that tech so they don't have to optimize their code. So rather than DLSS being extra oomph, it's going to be required for "acceptable" performance.
To add on to this, the 5000 series now generates 3 fake frames per real frame instead of just 1.
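With the numbers from above, 60 real frames per second would then show up as roughly 240 frames on screen instead of 120.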
Is “fake” being used as a pejorative here?
I was just using the term that the previous commenter used to keep terms consistent.
Yeah not sure if there’s a better word to use without coming across as pedantic.
Fake certainly implies these are worse (which they of course are), but I’m not sure if they’re that much worse. I think in many scenarios the proverbial juice would absolutely be worth the squeeze, but naysayers seem to disagree with that sentiment.
Yes.
Can someone explain how AI can generate a frame faster than the conventional method?
It is image processing with statistics rather than traditional rendering; it's a completely separate process. Also, Nvidia GPUs (and the upcoming AMD ones too) have hardware built into the chip specifically for this.
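Rough sketch of why that's cheaper, if code helps (a made-up toy version in Python; the real pipeline uses optical-flow hardware and a neural network, not this):

```python
import numpy as np

def generate_frame(prev_frame, motion_vectors):
    # Toy frame generation: shift each pixel of an already-rendered frame
    # along a per-pixel motion vector. No geometry, lighting, or shading is
    # run at all -- it's pure image processing, which is why dedicated
    # hardware can produce these much faster than a full render.
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    return prev_frame[src_y, src_x]

# Stand-ins: one rendered frame and motion vectors the game engine would supply.
prev = np.random.randint(0, 256, (108, 192, 3), dtype=np.uint8)
motion = np.ones((108, 192, 2)) * 2.0   # everything drifting 2 px right and down
guess = generate_frame(prev, motion)
```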
(that’s part of the grift)
Which part? I mean even if it isn’t generating the frames well, it’s still doing the work. So that capability is there. What’s the grift?
That it's reliable. The key point they're selling is that devs won't need to optimize their engines as much, obfuscated of course under a lot of other value-adds.
I'd go further than this and say part of our problem generally is that code optimization isn't a focus anymore. Apps that merely interface with web APIs are sometimes more than 90 MB. That's embarrassing.
The idea that AI can step in as a savior for poor coding practices is really a band-aid stuck over the root cause.
I see, thank you