as an AMD fanboy, I gotta say Nvidia's multi frame gen is just lit, even at 1080p

I've been using MFG on Nvidia since the 4070's launch and it absolutely does not feel as smooth as native high refresh. Motion fluidity is definitely an improvement, but you can tell it's not the same as native high refresh because certain things are only being updated at half the speed - most notably anything temporal such as RT denoising or ambient occlusion, screen-space effects like SSAO and SSR, as well as all of the artifacts from errors in the fake frames that "flicker" at half the MFG framerate (when using 2x frame-gen).

If you're a 60Hz or 75Hz gamer then you probably cannot feel the difference between 120Hz input lag and 240Hz input lag. If you've been a high-refresh gamer for several years already, the sluggishness and feel of MFG is notably worse than even the base framerate.

Let's say you have a system that can render at 100fps natively, or 185fps using 2x MFG. That 185fps is a base framerate of ~92fps, but because of the additional processing that MFG entails, there's some additional lag on top of what a native 92fps experience would normally be. It's going to feel like the game is running at about 75fps natively, killing the entire point of a high-refresh experience for so many people.
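Just to sanity-check that with plain arithmetic (a rough sketch re-deriving the post's own numbers, nothing vendor-specific):

```python
# Re-deriving the numbers above (plain arithmetic, no vendor data).
def frametime_ms(fps):
    return 1000.0 / fps

output_fps = 185.0
base_fps = output_fps / 2          # ~92.5 fps actually rendered with 2x FG

# If a ~92 fps base ends up feeling like ~75 fps native, the implied
# extra delay from frame generation is the frame-time gap between the two:
extra_lag_ms = frametime_ms(75.0) - frametime_ms(base_fps)

print(f"base frame time: {frametime_ms(base_fps):.1f} ms")  # ~10.8 ms
print(f"implied FG overhead: {extra_lag_ms:.1f} ms")        # ~2.5 ms
```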
That's why you should be using it when CPU bound. It doesn't really drop your base framerate. It still adds latency of course, but it's much more manageable.
 

In the games I've tried it in, the base frame rate takes a hit at each step, up to about 10% in total going from off to 4x.


100% agree though, its best use case is in CPU-limited scenarios.
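For a rough sense of the trade, here's that ~10% hit as a toy calculation (figures are mine, assuming the full loss lands at the 4x step):

```python
# Toy numbers for the ~10% base-rate hit described above (my figures).
base_fps = 100.0
hit = 0.10                              # ~10% base-rate loss going to 4x

effective_base = base_fps * (1 - hit)   # 90 fps actually rendered
output_fps = effective_base * 4         # 360 fps presented at 4x

print(f"rendered: {effective_base:.0f} fps, presented: {output_fps:.0f} fps")
# You see 360 fps of motion, but input is still sampled at 90 fps, not 100.
```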
 
I've been using MFG on Nvidia since the 4070's launch and it absolutely does not feel as smooth as native high refresh.

You can't use MFG on a 4070, only 2x FG.

Anyway, for me, FG 2x is a step up in gaming experience over 60-80 FPS without FG in lots of games, and that is with the old DLSS3 Frame Generation running on Optical Flow (which has lower FPS --> worse latency than DLSS4 FG). In games where I can already get >100 FPS then yeah, sure, frame gen isn't needed, but that's only true because I'm on a 120 Hz monitor.

Right now MFG 3x would play best on a 240 Hz monitor and MFG 4x on a 360 Hz monitor.
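A quick back-of-envelope on why those pairings make sense (the ~60 fps base floor is the commonly cited rule of thumb, not anything official):

```python
# Matching the MFG multiplier to the monitor refresh: fill the refresh
# rate while keeping the base framerate at or above the ~60 fps floor.
for mult, monitor_hz in [(2, 120), (3, 240), (4, 360)]:
    base = monitor_hz / mult
    print(f"{mult}x on a {monitor_hz} Hz monitor -> {base:.0f} fps base")
# 2x/120 Hz -> 60 fps base, 3x/240 Hz -> 80 fps base, 4x/360 Hz -> 90 fps
# base: each pairing keeps the rendered framerate above the ~60 fps floor.
```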
 
I think it's time we stop treating frame generation as a late-2000s smart TV's dumb frame interpolation; much has been done at the hardware and software level to mitigate that
Frame interpolation has its uses; fixing low framerates is not one of them.
 
@_roman_
Christ, I just saw your edit. Use quotes, use tags, something. It’s impossible to engage with you when you just answer with a post number, it DOESN’T get a notification.

Anyway, to the nonsense.
edit: to make it 100% clear for post #43

Page 1: How Ada advances the science of graphics with DLSS 3.
DLSS3 = old technology years old
…there is a second link with a whitepaper for DLSS4. It’s literally there in my post. Are… are you fucking with me?

My response was to "NVidia has a lot of problems, but that's not one of them. They ALWAYS release whitepapers on their tech." and my statement was: if you are unable to explain, do not take it personally when you provide me with a bad document. That is a marketing hoax document, not a datasheet for electronics or some other proper tech PDF file. It's marketing. I'm sure I could find that PDF's content quite fast on the net as the usual marketing hoax slideshow.
Are you asking me to explain or people who actually work on the tech? What is it that you even want? Of course NV won’t provide in-depth algorithms for their tech - it’s proprietary and there is no reason for them to do so. All they CAN do is give a broad technical overview. Expecting a full breakdown of HOW precisely they do what they do is nonsensical, this isn’t an open-source solution.

Is this your first time talking to him? :D
I definitely didn’t expect these levels of “what the actual fuck are you talking about”, no.
 
Looks, does not feel. Depends heavily on your base FPS though. 60 FPS or more is recommended for both FSR and DLSS frame gen. I've tried both and don't care for either. If you want a decent experience, you can turn on NVIDIA Reflex to help with the latency issues of frame gen on Nvidia's side of things. It will still have generally worse latency than native though, and some people can notice it extremely easily (like me), and others cannot. I'm sure it's nice for older gamers too.

Honestly, you just need to find what you're comfortable with and use it. I don't like frame gen, not necessarily because I don't like the tech (I think it's neat), but because of the implications it could have for the already diminishing generational uplift we've been seeing on average.

I think framegen is perfectly okay in games like Cyberpunk 2077 but honestly it should have no place in games like Marvel Rivals. Singleplayer games are perfect for framegen, whereas framegen in competitive games... yucky. I don't think I need to explain why it's bad for competitive games.
This thread is for trolling
 
FG was somewhat proposed for older hardware:

“I think this is primarily a question of optimization and also engineering and then the ultimate user experience, so we’re launching this frame generation, the best multi-frame generation technology with the 50-series, and we’ll see what we’re able to squeeze out of older hardware in the future.” - Bryan Catanzaro

Any update on whether the 30-series will get a taste of FG? Or maybe that ship has already sailed?

Also, is anyone using AMD's AFMF on a 30-series? Any input on the uplift?
 
Also, is anyone using AMD's AFMF on a 30-series? Any input on the uplift?

You mean FSR 3 frame generation? Because AFMF is AMD's global FG exclusive to Radeon. My experience with FSR FG has been extremely poor, but I only really tried it on FF16, where DLSS itself doesn't run too well.
 
I'm still waiting for a blind test with two PCs running 360 Hz monitors, one with 4x MFG and one without FG, to see which PC people find offers the better gaming experience.

Right now it's just personal opinions against MFG coming from hardware reviewers who hardly play games.

yeah, thank you for saying this; this is what I was trying to convey with my earlier posts.
 
I think this discussion should be separated into singleplayer and multiplayer games. I don't care about input lag when I F5 save/F8 load all the time in a game. But it does matter if you miss a shot or a click in an online PvP scenario.

I don't use FG, but I have fallen in love with DLSS4 - Transformer; it looks good compared to TAA and it runs a lot better. It's one of the reasons I returned an RX 9070 and will probably wait for the 5070 SUPER to upgrade from my 4070.
 
yeah, thank you for saying this; this is what I was trying to convey with my earlier posts.

Right now it's a trade-off that just comes down to preference.

Smoother image vs higher latency/artifacts.

Where it works best is either when the base framerate is already high or when CPU-limited, with the latter being the more useful scenario, because honestly, if my framerate is already decent I'd rather not use it in the first place, and when the base framerate is low it sucks because all its downsides become even more obvious.

Like others have said, not every person has the same perception of the artifacts or latency increase. To me both are obvious; my wife, I doubt she'd be able to tell them apart, and she would probably think the artifacts are just graphical glitches inherent in the game engine.

It can even vary on a game-to-game basis; some developers do a decent enough job, others not so much. DLSS SR isn't good in every game as it is, some implementations are terrible, and FG is no different.
 
yeah, thank you for saying this; this is what I was trying to convey with my earlier posts.

Wait until I get my hands on the 5090 and some 240hz/480hz dual mode OLED :roll:
 
So... personal preference is always going to be for a non-interpolated image. That being said, what I hear from Nvidia and from AMD is the same thing.

Taken in broad strokes, and with some silly numbers, I'd like to try and express this from a gameplay perspective. Because you are taking 1 frame and generating 3, on paper you get 4x the FPS. In reality, you get much less. The interpolated frames could potentially take twice as long to generate, meaning 4/2 = 2x the actual frame rate. Because you have variation in the complexity of what is rendered, one frame quartet could take 1 second, another could take 2 seconds, and you thus get frame timing issues despite having many, many more frames. FSR and DLSS both take this and attempt to apply more algorithms to smooth everything out... and in my opinion really don't offer anything more than extra frames that simply retard the intended motion. I.e., this would be absolutely the way to generate extra frames to be smooth... if everything moved fluidly.
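Putting those silly numbers into an equally silly toy calculation (all figures invented here, and note that real FG pipelines overlap generation with rendering, so this is only the gist):

```python
# Toy pacing model in the same spirit as the silly numbers above
# (all figures invented, nothing measured).
real_frame_ms = 10.0      # time to render one real frame
gen_frame_ms = 5.0        # assumed time to synthesize one interpolated frame

# One "quartet" = 1 real frame + 3 generated frames at 4x:
quartet_ms = real_frame_ms + 3 * gen_frame_ms     # 25 ms for 4 frames
effective_fps = 4 / (quartet_ms / 1000.0)         # 160 fps, not 400
native_fps = 1000.0 / real_frame_ms               # 100 fps without FG

print(f"native {native_fps:.0f} fps vs {effective_fps:.0f} fps at '4x'")
# 4x the frames but only 1.6x the throughput here, and any variation in
# quartet cost shows up directly as uneven frame pacing.
```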

Unfortunately, the pattern is that there's another derivative. The derivative of position is velocity. The derivative of velocity is acceleration. The derivative of acceleration is jerk. Frame interpolation, which is what DLSS and FSR attempt, basically flattens out any jerk, and people who know natural systems have a very hard time whenever they detect spikes in jerk that their eyeballs cannot call out, but that they cannot un-see once pointed out. Call me old fashioned, but objects with instantaneous spikes in jerk are silly difficult to deal with, and this is why I'm against the idea of frame generation, interpolation, or whatever you want to call it. I'm also someone who gets a splitting headache at a 70 degree FOV... so power to the people who like this stuff. I just can't deal with this crap, and would choose 60 FPS over 180 every day of the week if the latter was framegen... because bigger numbers are not always a better experience.
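For what it's worth, you can see that kink-in-the-motion effect numerically; here's a little finite-difference sketch (numpy, toy sine motion of my own choosing, nothing from any vendor):

```python
import numpy as np

# Finite-difference sketch of the derivative chain described above:
# position -> velocity -> acceleration -> jerk (toy motion data, mine).
dt = 1.0 / 60.0                                   # 60 real samples per second
t = np.arange(0.0, 1.0, dt)
position = np.sin(2 * np.pi * t)                  # smooth "real" motion

# Linearly interpolate one extra sample between each real pair (a 2x "FG"):
t2 = np.arange(0.0, t[-1], dt / 2)
lerped = np.interp(t2, t, position)

def max_jerk(x, step):
    return np.abs(np.diff(x, n=3) / step**3).max()  # third finite difference

smooth_120hz = np.sin(2 * np.pi * t2)             # truly smooth at 120 Hz

print(f"max |jerk|, smooth: {max_jerk(smooth_120hz, dt / 2):.0f}")
print(f"max |jerk|, lerped: {max_jerk(lerped, dt / 2):.0f}")
# Piecewise-linear motion has zero acceleration inside each segment and a
# kink at every real-frame boundary, so its jerk spikes far above smooth
# motion's: the "instantaneous spikes in jerk" the post describes.
```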
 
Wait until I get my hands on the 5090 and some 240hz/480hz dual mode OLED :roll:

Damn, you're still waiting. They've gotten easier to grab at Microcenter; they've had a few the last couple times I was there, at inflated prices ofc, although I saw a Gigabyte model for around 2700 lol, chump change....
 
I don't use FG, but I have fallen in love with DLSS4 - Transformer; it looks good compared to TAA and it runs a lot better.
Same here with Transformer DLSS.
Nowadays if a game supports that via the Nvidia App then I don't even think about playing it natively, especially with TAA. It even looks great on my 21:9 2560x1080 monitor.

That alone is enough for me to stick to Nvidia regardless of what anyone says.
FG/MFG, sure, I will at least check it out and then decide for myself if I'm ok with it or not, but having more options is something I would never complain about.

I was also considering waiting for the 5070 Super, but eh, that will most likely be way out of my budget range, and I kind of want to sell off my 3060 Ti while it's still worth something and not dead.:laugh:
 
You mean FSR 3 frame generation? Because AFMF is AMD's global FG exclusive to Radeon. My experience with FSR FG has been extremely poor, but I only really tried it on FF16, where DLSS itself doesn't run too well.

Yes FSR 3! Guilty of not keeping up!

Everything I'm currently playing runs sweet at native. But if I end up on something that gives the 3080 a bit of stick, I might give FG a go. Then again, if it's getting the stick, usually no questions asked it's 'upgrade time' rather than tweaking for scraps. Unless it's more than just scraps.
 
Seems Nvidia is threatening the press again.

I don't get why. They are swimming in money. Then again, Intel showed similar behavior when they were a quasi-monopolist...
 
from what I understand multi frame gen high refresh looks/feels just as smooth as native high refresh
For the most part this is true. They're not real frames; they're just bitmaps of two frames blended together and sent to the display as a full frame. It is a bit more complicated than that, but that's effectively what it is.

Looks, does not feel.
Um, no. I've played with this. It "feels" fine.
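For anyone curious what the "blended bitmaps" baseline described above would literally look like, here's a naive sketch (illustrative only; real DLSS FG warps frames using motion vectors and, on newer GPUs, a learned model rather than a plain average):

```python
import numpy as np

# Naive two-frame blend, the crude version of the "blended bitmaps" idea
# (illustrative only; actual frame generation is motion-compensated).
def blend_frames(frame_a, frame_b, weight=0.5):
    """Per-pixel mix of two rendered frames."""
    return frame_a * (1.0 - weight) + frame_b * weight

frame_a = np.zeros((1080, 1920, 3), dtype=np.float32)  # stand-in frames
frame_b = np.ones((1080, 1920, 3), dtype=np.float32)
midframe = blend_frames(frame_a, frame_b)              # 0.5 everywhere

# A plain blend ghosts anything that moves (the object shows up in both
# positions at half opacity), which is why motion compensation matters.
print(midframe.mean())
```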
 
You mean FSR 3 frame generation? Because AFMF is AMD's global FG exclusive to Radeon.
I use AFMF and it works very well in some games and poorly in others, so it depends on the engine.
 
I use AFMF and it works very well in some games and poorly in others, so it depends on the engine.

I am not really fond of frame generation, but I agree. There are some games where it works quite well overall, but there are others that are an absolute mess. The worst implementation of frame generation I've ever seen in any game has gotta be FF16. FSR 3's FG just makes the game feel extremely stuttery, while DLSS has extreme ghosting and temporal artifacting. I forced myself to finish it on the RTX 4080.
 
The tech itself is super impressive, but it's been used to justify the absolutely terrible generational gains of the 50 series. On top of that, Nvidia is now doing previews where they force outlets to compare older GPUs with it enabled, which is beyond scummy.

I love the idea of frame generation, but like with anything it's just been abused by developers who do not want to optimize their games, and now it's even pushed on gamers as extra performance by Nvidia, with the "or else you won't get our drivers day 1" BS.

PC gaming is bigger than ever, but I can honestly say that on the games/hardware side it's worse than it's ever been, and FG is part of that, which is a shame because it should be a win-more feature, not a crutch...
 
Nvidia is now doing previews where they force outlets to compare older GPUs with it enabled, which is beyond scummy.
They weren't forced; they were offered money and chose to take the payoff for being one of nVidia's bitches.
 