
as an AMD fanboy, I gotta say Nvidia's multi frame gen is just lit, even at 1080p

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
18,226 (4.70/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($189)
Cooling RZ620 (White/Silver) ($32)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3200 core -.75v ($705)
Display(s) Agon QHD 27" QD-OLED Glossy 240hz ($399)
Case NZXT H710 (Black/Red) ($62)
Power Supply Corsair RM850x ($109)

Imagine building a 1080p high refresh rig where you drop $150 for a 280hz 1080p monitor and $299 for an rtx 5060 gpu that can do this with multi frame gen on: get the fuck out bruh, 12yr old me would be ecstatic



from what I understand multi frame gen high refresh looks/feels just as smooth as native high refresh

honestly nothing wrong with 23.8" high refresh 1080p gaming, i don't think i could enjoy 25" or 27" 1080p, but yeah i know most of us here are using higher resolutions. i'm just saying try to put yourself in a budget mindset, that's damn impressive for $299 (considering the market for 1440p high refresh or higher gaming is much, much more expensive)

cause this rig could even rock a used budget 5700x3d super cheap too, etc.
 
from what I understand multi frame gen high refresh looks/feels just as smooth as native high refresh
Looks, does not feel. Depends heavily on your base FPS though. 60 FPS or more is recommended for both FSR and DLSS frame gen. I've tried both and don't care for either. If you want a decent experience, you can turn on NVIDIA Reflex to help with frame gen's latency issues on Nvidia's side of things. It will still have generally worse latency than native though, and some people (like me) can notice it extremely easily, while others cannot. I'm sure it's nice for older gamers too.

Honestly, you just need to find what you're comfortable with and use it. I don't like frame gen, not necessarily because of the tech itself (I think it's neat), but because of what it could mean for the already diminishing generational uplift we've been seeing on average.

I think frame gen is perfectly okay in games like Cyberpunk 2077 but honestly should have no place in games like Marvel Rivals. Singleplayer games are perfect for frame gen, whereas frame gen in competitive games... yucky. I don't think I need to explain why it's bad for competitive games.
 
Looks, does not feel. Depends heavily on your base FPS though. 60 FPS or more is recommended for both FSR and DLSS frame gen. I've tried both and don't care for either. If you want a decent experience, you can turn on NVIDIA Reflex to help with frame gen's latency issues on Nvidia's side of things. It will still have generally worse latency than native though, and some people (like me) can notice it extremely easily, while others cannot. I'm sure it's nice for older gamers too.

Honestly, you just need to find what you're comfortable with and use it. I don't like frame gen, not necessarily because of the tech itself (I think it's neat), but because of what it could mean for the already diminishing generational uplift we've been seeing on average.

I think frame gen is perfectly okay in games like Cyberpunk 2077 but honestly should have no place in games like Marvel Rivals. Singleplayer games are perfect for frame gen, whereas frame gen in competitive games... yucky.

you tried both with an RTX 2080 gpu? you can't compare DLSS 4 multi frame gen advancements to your experience unless your experience was on a 5000 series card with the latest drivers.

from reviews i read it does feel the same, though i have never experienced it myself. I do agree it's only for single player games.
 
you tried both with an RTX 2080 gpu? you can't compare DLSS 4 multi frame gen advancements to your experience unless your experience was on a 5000 series card with the latest drivers.
Different computer that someone else owned; I was curious. I have not tried frame gen with the 5000 series in particular though. I tried frame gen on a 4070 Ti at a friend's house, and was mixed on it to say the least.
from reviews i read it does feel the same, though i have never experienced it myself. I do agree it's only for single player games.
I dunno. It doesn't feel the same to me at all, even with NVIDIA Reflex overdrive. W1zzard's game reviews as of late break down what he feels is 'good' latency with frame gen; you can see it in particular in his Doom: The Dark Ages testing.


Due to the fast gameplay and the fact that you'll be doing flick-shots very often, I'd recommend a latency of 30 ms or less. At 35 ms I can start feeling the added lag. While it's still perfectly playable, I definitely prefer latency sub-30 ms over smoother frames here.

(chart in the link provides context for this quote)
 
you tried both with an RTX 2080 gpu? you can't compare DLSS 4 multi frame gen advancements to your experience unless your experience was on a 5000 series card with the latest drivers.
It doesn’t matter. There is no magic trick that can reduce latency for frame gen versus native. It might “feel” the same with a controller because the input is always smooth just due to the nature of analog sticks for camera control, but anyone remotely sensitive to latency will be able to tell FG and MFG from native with a mouse basically 10 times out of 10.
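A back-of-the-envelope way to see that point: interpolation-based frame gen has to hold back one real frame so it has two endpoints to blend between, which puts a floor on the added lag. The Python sketch below is my own simplification for illustration, not Nvidia's documented pipeline.

```python
# Minimal model: generated frames raise the displayed framerate, but
# input is still sampled once per *real* frame, and the interpolator
# must buffer one real frame before it can blend, so the display lags
# input by at least one extra base frametime.

def fg_estimate(base_fps: float, factor: int) -> dict:
    """Output framerate and minimum extra latency for a given FG factor."""
    base_frametime_ms = 1000.0 / base_fps
    return {
        "output_fps": base_fps * factor,            # what the FPS counter shows
        "min_added_latency_ms": base_frametime_ms,  # one held-back real frame
    }

for base in (30, 60, 120):
    est = fg_estimate(base, 4)
    print(f"{base} fps base -> {est['output_fps']:.0f} fps shown, "
          f">= {est['min_added_latency_ms']:.1f} ms extra lag")
```

The model makes the 60+ FPS recommendation intuitive: at a 30 fps base the unavoidable penalty is already over 33 ms, while at 120 fps it shrinks to about 8 ms.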

I think frame gen is perfectly okay in games like Cyberpunk 2077 but honestly should have no place in games like Marvel Rivals. Singleplayer games are perfect for frame gen, whereas frame gen in competitive games... yucky. I don't think I need to explain why it's bad for competitive games
The problem with MR is that it runs like absolute dogshit otherwise, even on good hardware. The fact that NetEase thought that kind of performance in a fast paced competitive shooter is acceptable boggles the mind, especially when compared to the main competitor, Overwatch, which, for all its flaws, has an excellent engine.
 
Different computer that someone else owned; I was curious. I have not tried frame gen with the 5000 series in particular though. I tried frame gen on a 4070 Ti at a friend's house, and was mixed on it to say the least.

I dunno. It doesn't feel the same to me at all, even with Reflex overdrive.

hmm, good to know, but I still think the 5000 series card has actual hardware on it that is different, so it might feel different on a 5000 series card. we need someone who has owned both 4000 and 5000 to comment lol

@oxrufiioxo haven't you owned both?
 
We really need to talk about "Throughput vs Performance"
 
It doesn’t matter. There is no magic trick that can reduce latency for frame gen versus native. It might “feel” the same with a controller because the input is always smooth just due to the nature of analog sticks for camera control, but anyone remotely sensitive to latency will be able to tell FG and MFG from native with a mouse basically 10 times out of 10.
This is what it was like for me. Even with low sensitivity I could feel the latency. NVIDIA Reflex only does so much after all, but it's nice that it at least exists. I actually tend to use it in Cyberpunk without frame gen, funnily enough.

As for FSR Frame Gen, it's definitely worse than NVIDIA's, but at least for people interested in frame gen there's an option out there for them to use. I still am not using it though.

The problem with MR is that it runs like absolute dogshit otherwise, even on good hardware. The fact that NetEase thought that kind of performance in a fast paced competitive shooter is acceptable boggles the mind, especially when compared to the main competitor, Overwatch, which, for all its flaws, has an excellent engine.
I mean the game is riddled with weirdness. It supposedly had both an American and a Chinese development team, and the entire American part was laid off, from what the grapevine told me. Take that with a grain of salt.

Overwatch in comparison has a pretty good engine; I wouldn't say excellent (I reserve that for the likes of id Tech, etc.)

hmm, good to know, but I still think the 5000 series card has actual hardware on it that is different, so it might feel different on a 5000 series card. we need someone who has owned both 4000 and 5000 to comment lol
It certainly (at least in my uneducated opinion, going off the hardware) probably doesn't feel too different. It could, especially since the 5000 series was targeted towards AI; I'm not sure, as again, I never tried MFG with the 5000 series, just regular FG on the 4000 series.

@oxrufiioxo haven't you owned both?
Doesn't just have to be them; we can get multiple people on this who owned / tried both. Let's just try to not devolve this into a huge toxic thread..
 
Overwatch in comparison has a pretty good engine; I wouldn't say excellent (I reserve that for the likes of id Tech, etc.)
Nah, it actually is very, very good and relatively undervalued. It’s very consistent and tight with frametimes and has flawless input, even implementing sub-tick for mouse movement and click actions, which very few games do. As far as competitive FPS go, it’s basically the best engine currently, funnily enough. Siege runs Anvil, which was never really meant for such cases and even after tons of upgrades remains iffy (at least they decoupled mouse input from framerate, that was terrible), Valorant is just UE4 and bruteforces things via 128 tick servers, and CS2’s Source 2 is plagued with terrible 1% lows and some elements that hold it back, like its AnimGraph system being completely unsuitable for a competitive FPS, which even Valve has admitted and is working on potentially replacing.
 
Nah, it actually is very, very good and relatively undervalued. It’s very consistent and tight with frametimes and has flawless input, even implementing sub-tick for mouse movement and click actions, which very few games do. As far as competitive FPS go, it’s basically the best engine currently, funnily enough. Siege runs Anvil, which was never really meant for such cases and even after tons of upgrades remains iffy (at least they decoupled mouse input from framerate, that was terrible), Valorant is just UE4 and bruteforces things via 128 tick servers, and CS2’s Source 2 is plagued with terrible 1% lows and some elements that hold it back, like its AnimGraph system being completely unsuitable for a competitive FPS, which even Valve has admitted and is working on potentially replacing.
To be perfectly fair, I never got the chance to get hands-on with the performance of OW2 because I was only really interested in OW1, and that game got replaced by what many consider to be an objectively worse game; I lost interest in hero shooters after.

I'll definitely have to compare it with some of my favorite optimized engines, such as id Tech 7.

Anyway, we're getting off topic.
 
Last edited:
hmm, good to know, but I still think the 5000 series card has actual hardware on it that is different, so it might feel different on a 5000 series card. we need someone who has owned both 4000 and 5000 to comment lol

@oxrufiioxo haven't you owned both?
People have a knee-jerk reaction about latency, you'd think everyone is a world champ at esports. Most games don't give a crap about latency, they play just fine.
The problem (as I see it) with the tested setup is super-resolution. Pre-Blackwell, upscaling from resolutions lower than FHD resulted in blockiness/artifacts. And I don't think Blackwell changed that. But I will agree MFG opens up some possibilities that just weren't there before.
 
Frame generation is cool and it makes for nice visual step ups.
It still needs a decent base frame rate and acceptable low dips to generate from and it eats up VRAM in a way that might be an issue for some.

What is not cool is Nvidia trying to make it look like the new native, and even less cool is Nvidia trying to force reviewers to adopt their overreaching marketing narrative supporting these performance claims.
The 5070 != 4090 via MFG, and the politics around the 5060 reviews seem downright horrible. I know that the overreacting, posturing and being a bit of a fanboy is part of the game and adds to the revenue stream, but in the end, if there are no independent reviews, we the consumers will be the ones hurting

So MFG is a good thing, DLSS is a good thing, but the way it is portrayed by marketing at the moment is not.

Nvidia, according to HUB and GN, gives access to the drivers for 5060 reviews only if you benchmark at 1080p, 4x MFG and selected games.
With that fulfilled you can compare to others running the same settings. The takeaway here is to NOT trust the first round of reviews!


EDIT: I had a link to a youtube post from GN that I removed after realising it might be a bit flame-encouraging, but I was too slow, sorry =)
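The VRAM point above is easy to ballpark. The Python sketch below is assumption-heavy: it only counts the raw full-resolution color buffers frame gen has to hold, while the real pipeline also keeps optical-flow and history buffers that Nvidia doesn't document, which is why reviews measure overheads in the hundreds of MB. Treat it as a lower bound.

```python
# Lower-bound VRAM cost of buffering extra frames for frame generation.
# bytes_per_pixel=8 assumes an RGBA16F (FP16) color buffer; frames=2
# covers the two real frames an interpolator blends between.

def held_frames_mib(width: int, height: int,
                    bytes_per_pixel: int = 8,
                    frames: int = 2) -> float:
    """MiB needed just to hold `frames` full-resolution color buffers."""
    return width * height * bytes_per_pixel * frames / (1024 ** 2)

print(f"1080p: {held_frames_mib(1920, 1080):.1f} MiB")  # ~31.6 MiB
print(f"4K:    {held_frames_mib(3840, 2160):.1f} MiB")  # ~126.6 MiB
```

Even this floor scales linearly with pixel count, which is part of why the cost stings more on 8 GB cards pushed to higher resolutions.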
 
Last edited:
People have a knee-jerk reaction about latency, you'd think everyone is a world champ at esports. Most games don't give a crap about latency, they play just fine.
The problem (as I see it) with the tested setup is super-resolution. Pre-Blackwell, upscaling from resolutions lower than FHD resulted in blockiness/artifacts. And I don't think Blackwell changed that. But I will agree MFG opens up some possibilities that just weren't there before.

the article i linked in post 1 said Doom: The Dark Ages looked amazing, so i dunno, i was just going based off that, cause the article is about a 1080p rig.

Frame generation is cool and it makes for nice visual step ups.
It still needs a decent base frame rate and acceptable low dips to generate from and it eats up VRAM in a way that might be an issue for some.

What is not cool is Nvidia trying to make it look like the new native, and even less cool is Nvidia trying to force reviewers to adopt their overreaching marketing narrative supporting these performance claims.
The 5070 != 4090 via MFG, and the politics around the 5060 reviews seem downright horrible.
I know that the overreacting and posturing is part of the game and adds to the revenue stream, but in the end, if there are no independent reviews, we the consumers will be the ones hurting

EDIT: I hope it's fair to link that in.

i just need to experience an rtx 5000 series rig next to my current 7900 xt rig and decide for myself. not sure when that will happen though. prob not for a couple years but eh
 
my current 7900 xt
Only worth it if money is a non-issue and you find yourself unreasonably bored, and/or needing more GPUs. Otherwise, wait till something actually good comes out. Blackwell is a fiasco and a half and RDNA4 isn't doing anything about it, either.

With base framerates like that (the 5060 is gonna be roughly 6700 XT / 3060 Ti level of performance), MFG isn't going to make things magically better. No, 299 dollars is too much for this hardware no matter what Nvidia or your local store tell you. And I haven't even started on the VRAM thing...
 
I'll be honest here, i don't get frame generation at all: any GPU that can render at 1:1 doesn't need it, and if yours can't, then it's perhaps time to upgrade.
If you aim for high refresh rate gaming you really don't want any generated frames, especially in esports games.
If you aim for high fidelity gaming, you don't want generated frames either; they look worse (always).

If you can't get the frames necessary for those two, then upgrade your GPU.
 
I will get back to you once I get my hands on that 5070 in a few weeks; I'm also playing DOOM currently, so it will still be on my PC at the time.
Luckily I'm not sensitive to latency, so I could see myself being fine with MFG now that I have a 200Hz monitor (not that I can notice any difference over 120, but that's beside the point :oops:)
 
I'm still waiting for a blind test with two PCs running 360hz monitors, one with 4x MFG and one without FG, to see which PC people find offers the better gaming experience.

Right now it's just personal opinions coming from hardware reviewers who hardly play games and are against MFG
 
...

So MFG is a good thing, DLSS is a good thing, but the way it is portrayed by marketing at the moment is not.

...
Eh, when was the last time something came out of marketing that was useful, informative and to the point? :P
 
The lag was supposed to be halved at 2x, but it increases at 3x and 4x. Now that paints a different picture.
 
Last edited:

Imagine building a 1080p high refresh rig where you drop $150 for a 280hz 1080p monitor and $299 for an rtx 5060 gpu that can do this with multi frame gen on: get the fuck out bruh, 12yr old me would be ecstatic


from what I understand multi frame gen high refresh looks/feels just as smooth as native high refresh

honestly nothing wrong with 23.8" high refresh 1080p gaming, i don't think i could enjoy 25" or 27" 1080p, but yeah i know most of us here are using higher resolutions. i'm just saying try to put yourself in a budget mindset, that's damn impressive for $299 (considering the market for 1440p high refresh or higher gaming is much, much more expensive)

cause this rig could even rock a used budget 5700x3d super cheap too, etc.
MFG x4 is useless. It has too many artifacts and it doesn't even make sense to use, since in order for it to make any sense your base framerate must be obnoxiously low.

MFG 3 and 2 are decent. When you are running at 80+ fps the latency is negligible; between 50 and 70 it is noticeable, but for single player non-fps games it's usable.

It works like a charm in games where you are CPU bound; it doesn't work that well when you are GPU bound. So no, getting a 5060 and going 4x frame gen ain't really a great idea
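The "match the multiplier to your base framerate and display" logic in this post can be sketched like so. The 50 fps cutoff comes from this post's own numbers (80+ fps negligible, 50-70 noticeable), not any official guideline, and a real implementation would also account for VRR windows and per-game feel.

```python
# Pick the largest MFG factor whose output doesn't overshoot the
# monitor's refresh rate, and only if the base framerate is high
# enough that the added latency stays tolerable.

def pick_mfg_factor(base_fps: float, monitor_hz: float,
                    min_base_fps: float = 50.0) -> int:
    if base_fps < min_base_fps:
        return 1                       # latency penalty too big; skip FG
    for factor in (4, 3, 2):           # prefer the biggest usable multiplier
        if base_fps * factor <= monitor_hz:
            return factor
    return 1                           # already at/near the refresh cap

print(pick_mfg_factor(70, 240))   # 3: 70*3 = 210 fits a 240 Hz panel
print(pick_mfg_factor(40, 240))   # 1: base framerate too low to bother
```

It also shows why x4 rarely makes sense: on a 240 Hz display, x4 only fits when the base framerate is 60 fps or below, exactly where the latency penalty is largest.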
 
I've tested FG from AMD (6800 XT, FSR and AFMF), Nvidia (a friend's RTX 4070) and Lossless Scaling.

No matter which version, the game felt smoother but played like garbage. If you're using a mouse and keyboard, the input lag is terrible. Playing a game with a controller feels better, but you can still feel the increased input latency. And no, neither AMD Anti-Lag nor NVIDIA Reflex helped with this.

Bottom line, FG and MFG felt smoother than native, yes, but played way worse. I'd pick native over FG any day.

Also, the link you posted is from a site that is SPONSORED BY NVIDIA to do benchmarks the way Nvidia sees fit (for example, not being allowed to test vs. the 40 series).
More info on the whole debacle below (posting 3 different techtubers, since Nvidia shills don't like "biased" ones - for them, anyone who criticizes Nvidia is biased anyway :laugh: ):
 
I'll be honest here, i don't get frame generation at all: any GPU that can render at 1:1 doesn't need it, and if yours can't, then it's perhaps time to upgrade.
If you aim for high refresh rate gaming you really don't want any generated frames, especially in esports games.
If you aim for high fidelity gaming, you don't want generated frames either; they look worse (always).

If you can't get the frames necessary for those two, then upgrade your GPU.
FG is there for when your GPU doesn't cut it anymore, but you're out of kidneys so you can't buy a new one either.
 
FG is there for when your GPU doesn't cut it anymore, but you're out of kidneys so you can't buy a new one either.
I understand what i said may sound cruel, but this is the reality and always has been.
At 1080p you can get by nowadays with cheap gpus that can play almost all games at smooth FPS.
FG is a sacrifice i am not willing to accept, which is why i got my first AMD card when i built my most recent PC (and i've been way happier with the so-called 'shit' AMD drivers in comparison to NVIDIA's).
It's mostly marketing so the manufacturers can justify the price tag while offering worse performance increments than previous generations.
 
I understand what i said may sound cruel, but this is the reality and always has been.
At 1080p you can get by nowadays with cheap gpus that can play almost all games at smooth FPS.
FG is a sacrifice i am not willing to accept, which is why i got my first AMD card when i built my most recent PC (and i've been way happier with the so-called 'shit' AMD drivers in comparison to NVIDIA's).
It's mostly marketing so the manufacturers can justify the price tag while offering worse performance increments than previous generations.
Not with RT you can't.
FG is a (mostly) software feature. Considering I'm already paying an arm and a leg, I'd rather have that feature checked on the box.

We've had this discussion before. For a while people swore by SSAA, pointing out MSAA is not "true AA". The same thing happened with AF optimizations. The approximated method always won. It simply got better over time, till people even forgot about the original ways. Love it or hate it, FG will be no different.
 
Using any form of frame gen is nausea inducing to me; i can't even stand BFI, same feeling
 