
As an AMD fanboy, I gotta say Nvidia's multi frame gen is just lit, even at 1080p

Lol, anything that's <150 FPS with multi frame generation means the base frame rate is like <40-50 FPS, that feels like dogshit.

The more frames are interpolated the more useless it becomes for lower framerates.
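The arithmetic behind that claim is straightforward; here's a minimal sketch (hypothetical numbers, assuming an ideal Nx multiplier with no generation overhead — real overhead pushes the base rate even lower):

```python
def base_fps(displayed_fps: float, multiplier: int) -> float:
    """Recover the rendered (base) frame rate from the displayed rate
    under frame generation: displayed FPS divided by the multiplier."""
    return displayed_fps / multiplier

# 150 FPS displayed with 4x MFG -> 37.5 FPS actually rendered,
# which lands in the "<40-50 FPS" base range described above.
print(base_fps(150, 4))   # 37.5
print(base_fps(150, 2))   # 75.0
```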
 

I think it's time we stop treating frame generation like a late-2000s smart TV's dumb frame interpolation; much has been done at the hardware and software level to mitigate that

(attached architecture slides: arch22.jpg, arch24.jpg, arch26.jpg)

That being said, I never used it because none of the games I play support it. Oblivion Remastered was the last game I purchased and it only supports Ada-level 2x FG and sadly, needs it plus DLSS P to retain performance, even on the 5090. Messy.
 
@Dr. Dro
I feel that the logic that is applied to FG is faulty by the developers and community both. I don’t see it as a way to “improve performance” - it can’t do that by definition in cases where it would be useful (playing demanding titles on mainstream hardware) and in cases where it DOES work well it’s unnecessary for this purpose already for the most part. What it IS is essentially a reaction to ever increasing monitor panel refresh rates, a tool to slightly increase fluidity and perceived motion performance. It’s basically that thing that BlurBusters were suggesting as a necessity for the future 1000Hz screens. It’s a filler for refresh in the titles that just aren’t feasible to run at above, say, 120 FPS, which is mighty respectable and “enough”, but why not pad it out for your new shiny 480Hz OLED. Of course, NV manipulating numbers and using it as a performance increasing feature (famously the 5070=4090 bullshit comes to mind) does NOT FUCKING HELP. It’s a crowbar being used as a flashlight - it’s inherently dishonest and misses the point.
 

Exactly, and I'll go a step beyond... game developers are slowly but surely starting to count on it to achieve a decent performance level instead of optimizing. Just messy.
 
Different computer that someone else owned; I was curious. I haven't tried frame gen on a 5000 series in particular, though. I tried frame gen on a 4070 Ti at a friend's house, and it was mixed, to say the least.

I dunno. It doesn't feel the same to me at all, even with NVIDIA Reflex overdrive. W1zzard's game reviews as of late break down what he feels is 'good' latency with frame gen; you can see it in particular in his Doom: The Dark Ages testing.




(chart in the link provides context for this quote)
Latency really is the new metric now.. FPS is no longer reliable.
 
Latency really is the new metric now.. FPS is no longer reliable.
FPS was arguably always a terrible metric. The better one was and is frametimes and, yes, latency. Those are what you actually FEEL when playing.
 
Frame generation mainly helps high-refresh-rate displays show frames more smoothly, and this comes at the cost of true frames. It also doesn't work well with some game engines and creates artifacts.
So if you have a 360 Hz monitor, FG will be more useful than if you have a 120 Hz or even 60 Hz monitor.
The idea of FG is closer to triple buffering than to DLSS or more FPS.

The first-generation FG that Nvidia announced was a good thing, but the second generation, which inserts 3 extra generated frames, is just a marketing gimmick; it doesn't help and may make things worse. Unless, of course, you have a 500 Hz monitor...
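That "filler for high refresh rates" framing can be expressed as a toy calculation; the function name and option set below are purely illustrative, not from any vendor tool:

```python
def useful_multiplier(base_fps: float, monitor_hz: int,
                      options=(4, 3, 2)) -> int:
    """Pick the largest FG multiplier whose output still fits under the
    monitor's refresh rate; return 1 when there is no headroom (where
    FG would only add latency)."""
    for m in sorted(options, reverse=True):
        if base_fps * m <= monitor_hz:
            return m
    return 1

# 120 FPS base on a 480 Hz OLED leaves room for 4x;
# the same 120 FPS on a 144 Hz panel gains nothing.
print(useful_multiplier(120, 480))  # 4
print(useful_multiplier(120, 144))  # 1
```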
 
Using any form of FrameGen is nausea-inducing to me; I can't even stand BFI, same feeling
Now that is a valid argument. Luckily, FG is not on by default.

I know I said above that, considering how much I'm paying for the card, I'd rather have the option. I realize now I have no idea whether I will like looking at FG in action or not. Oh well, the proof is in the pudding, I guess.
 
I will say, normally with proprietary technologies (especially from Nvidia) I am on the side saying how bad they are for the industry, as Nvidia has a horrid history of using them just to make games work worse on AMD products for marginal improvements (if any; GameWorks, anyone?). However, multi frame gen is probably the first tech I've seen come out that I truly appreciate. It works fairly well and can make games much more playable on lower-end cards, which is something I think we all appreciate. It's finally a tech that doesn't hamper usage on other cards; it just gives you extra performance on yours.

While I personally believe both sides' frame gen are very similar, Nvidia's has more support currently. There are drawbacks on both sides, but I'm really looking forward to seeing it improve over the years!
 
@Dr. Dro
I feel that the logic that is applied to FG is faulty by the developers and community both. I don’t see it as a way to “improve performance” - it can’t do that by definition in cases where it would be useful (playing demanding titles on mainstream hardware) and in cases where it DOES work well it’s unnecessary for this purpose already for the most part. What it IS is essentially a reaction to ever increasing monitor panel refresh rates, a tool to slightly increase fluidity and perceived motion performance. It’s basically that thing that BlurBusters were suggesting as a necessity for the future 1000Hz screens. It’s a filler for refresh in the titles that just aren’t feasible to run at above, say, 120 FPS, which is mighty respectable and “enough”, but why not pad it out for your new shiny 480Hz OLED. Of course, NV manipulating numbers and using it as a performance increasing feature (famously the 5070=4090 bullshit comes to mind) does NOT FUCKING HELP. It’s a crowbar being used as a flashlight - it’s inherently dishonest and misses the point.
Yes, but I found it incredibly useful in badly written games where CPUs struggle to hold a consistent 60: early Hogwarts, Jedi 2, etc. It's a tool to help when your CPU runs out of steam, not so much when you run out of GPU.

When you are GPU-bound, turning it on lowers your real frames even more, resulting in really bad latency in most games.
 
Hmm, good to know, but I still think the 5000 series has actual hardware on it that's different, so it might feel different on a 5000 series card. We need someone who has owned both a 4000 and a 5000 to comment, lol

@oxrufiioxo haven't you owned both?

I've had about 10 hours hands-on with a 5090 and a 5070 Ti, but I don't like frame generation in 2x mode, let alone 4x mode. For me it's easy to feel the latency difference, and I can easily pick out when it artifacts, even at a 100 FPS base frame rate.

From my brief hands-on: while the new MF generation does make the image noticeably smoother, it also has more artifacts and worse latency.

I like frame generation one way: with a controller, at least 8 feet away from the image, on a decent-size OLED. For me it's unusable with a mouse and keyboard at a 2-3 foot viewing distance, and that doesn't change regardless of how many frames are interpolated.

But like with anything that's a personal thing everyone needs to try it out themselves.
 
I need another coffee.

When you cannot clearly explain what your tech does, you have a problem.

I'm not sure I even saw this topic covered today in the same form as the Gamers Nexus or Hardware Unboxed video.

This all sounds like marketing for fanboys to me. Too complicated. A little bit was explained in the Hardware Unboxed video from today: they explained how and why Nvidia enforced the test scenario of 4x fake frames versus cards which cannot do it, to get big numbers for the 5060 8 GB card.
 
When you cannot clearly explain what your tech does, you have a problem.
NVidia has a lot of problems, but that’s not one of them. They ALWAYS release whitepapers on their tech.



There is no point in trying to explain any of this to a complete layman, Joe Six Pack, because for him the specifics are irrelevant anyway.
 
Honestly? Stop praising this. The right comparison is 200 real frames versus 200 generated frames. Producing 200fps from a 50-60 FPS base doesn't match the quality and low latency of native ones. Period. I think this discussion has gone far enough... any further and people might start buying into Jensen’s fantasy that the 5070 = 4090, courtesy of the Fake Factory ×4.
 
Frame gen is maybe OK for people with slower brains who don't feel the mouse <-> picture latency, so maybe older folks, or I don't know. Or maybe people playing with gamepads or heavy mouse smoothing for some terrible reason. The latency stops being noticeable (for normal, healthy people) somewhere above 100 *real* FPS, where you already don't need more FPS. Below that, it helps only visually; latency-wise, there is this disgusting feeling that the mouse is connected to the PC not by a cable carrying electric signals, but by a rubber band or something. FG = useless.
 
I wonder how many consumers are able to read. Not just cheap four-word newspaper reading skills.

Post #1, the Hardware Unboxed video, the GameStar justification which Hardware Unboxed called out.

Comparing old graphics cards which cannot use 4x multi frame generation to a card which can is nonsense. I'm paraphrasing Hardware Unboxed in my own words.

#39

Thanks for the marketing hoax, which is a several-years-old PDF file.


Page 15: Nonsense. As a Gentoo guy I know H.264 and AV1 are video codecs. I doubt many people know that. So you get a different compression speed and different picture quality using a different codec. A codec is the mathematics of how you process your moving picture material. Nonsense.

2) It looks like old stuff. It's only about DLSS 3.

Page 10: I doubt that is a fact; that PDF looks like Nvidia marketing. I doubt the responsiveness, the smoothness, and the image quality. I doubt that this is really true, even without ever having used such a card for a single second. I'm fairly sure that stuff will cause artifacts, not responsiveness.

Page 8: They picked an aeroplane game, which most likely has slow input response implemented. They should have picked something which needs very fast human-to-computer interaction.

edit: to make it 100% clear for post #43

Page 1: "How Ada advances the science of graphics with DLSS 3."
DLSS 3 = old technology, years old.

My response was to "NVidia has a lot of problems, but that's not one of them. They ALWAYS release whitepapers on their tech." and to my statement: if you are unable to explain, you have a problem. Do not take it personally when you provide me with a bad document. That is a marketing hoax document, not a datasheet for electronics or some other proper tech PDF. It's marketing. I'm sure I could find that PDF's content quite quickly on the net as the usual marketing slideshow.
 
Thanks for the marketing hoax, which is a several-years-old PDF file.
- Where’s the explanation for how X feature works?
- *gets provided an explanation through an official whitepaper NV provides to researchers*
- This is nonsense, a marketing hoax.

Okay brother.

The rest is not even worth addressing; the "I know better than NV engineers because I use Linux (Gentoo BTW)" is pure cope. As for the "it's old, DLSS 3 only" complaint, there is literally a second link provided, fully dedicated to DLSS 4 features.

I doubt that is a fact; that PDF looks like Nvidia marketing. I doubt the responsiveness, the smoothness, and the image quality. I doubt that this is really true, even without ever having used such a card for a single second. I'm fairly sure that stuff will cause artifacts, not responsiveness.
O…kay? I thought you wanted to know how it works. Even NV does not claim that the FG is flawless - it inherently cannot be. I am not sure what this even has to do with anything.
 
Okay brother.
The average gamer, and even a majority of enthusiasts, aren't going to be reading corporate white papers.
O…kay? I thought you wanted to know how it works. Even NV does not claim that the FG is flawless - it inherently cannot be. I am not sure what this even has to do with anything.
The issue is most people don't know or care how it works, heck I've seen people here act like it's "free FPS" with no consequences to latency or game smoothness at all.
 
The average gamer isn't going to be reading corporate white papers.
…and? The fact that the “average gamer” is an illiterate dumbfuck isn’t really anyone’s problem other than said “average gamer’s”.

The issue is most people don't know or care how it works, heck I've seen people here act like it's "free FPS" with no consequences to latency or game smoothness at all.
Most people don’t know or care how anything related to tech works. This has nothing to do with NV or FG or DLSS specifically.
 
Computerbase did some MFG latency testing in Doom: The Dark Ages, and the results speak for themselves.

(attached chart: doom.jpg)

The 9070 XT with FSR3 Quality achieves higher FPS than the 5070 Ti with DLSS4 Quality, but its latency is only comparable to the 5070 Ti with DLSS4 Quality + MFG 3x. Meaning: if MFG has unplayable latency, so does the best of Radeon (even with FSR enabled).

So I guess the average gamer should stop listening to rage-bait media outlets like HUB and GN and try out MFG for themselves LOL
 
Honestly, anything past 2x FG is just brutally laggy.

I love the idea of FG and I use it whenever I can, but in some games (Remnant 2, Cyberpunk) it's way more fun just to turn down some settings. It works well for Hogwarts and some others, though.
 
- Where’s the explanation for how X feature works?
- *gets provided an explanation through an official whitepaper NV provides to researchers*
- This is nonsense, a marketing hoax.

Okay brother.
Is this your first time talking to him? :D
 
Nvidia, according to HUB and GN, gives driver access for 5060 reviews only if you benchmark at 1080p, with 4x MFG, and in selected games only.
With that fulfilled, you can compare to others running the same settings. The takeaway here is NOT to trust the first round of reviews!


EDIT: I had a link to a YouTube post from GN that I removed after realising it might be a bit flame-encouraging, but I was too slow, sorry =)
Also, the link you posted is from a site that is SPONSORED BY NVIDIA to do benchmarks the way Nvidia sees fit (for example, not being allowed to test against the 40 series).
More info on the whole debacle below (posting 3 different techtubers, since Nvidia shills don't like "biased" ones; for them, anyone who criticizes Nvidia is biased anyway :laugh: ):


Better watch out mentioning this; TPU's mods deleted a thread late last night linking GN's video. It might get by being in a thread titled to favor Nvidia, though.

Ironically, it was by Space Lynx as well. Honestly, I'm not sure this thread really is about DLSS FG being great, only that saying so would attract attention and people who would post the video. That makes it harder to delete, so well played getting around TPU's Nvidia censorship.
 
from what I understand multi frame gen high refresh looks/feels just as smooth as native high refresh
I've been using frame generation on Nvidia since the 4070's launch, and it absolutely does not feel as smooth as native high refresh. Motion fluidity is definitely an improvement, but you can tell it's not the same as native high refresh, because certain things are only being updated at half the speed: most notably anything temporal, such as RT denoising or ambient occlusion, and screen-space effects like SSAO and SSR, as well as all the artifacts from errors in the fake frames that "flicker" at half the framerate (when using 2x frame gen).

If you're a 60Hz or 75Hz gamer then you probably cannot feel the difference between 120Hz input lag and 240Hz input lag. If you've been a high-refresh gamer for several years already, the sluggishness and feel of MFG is notably worse than even the base framerate.

Let's say you have a system that can render at 100 FPS natively, or 185 FPS using 2x MFG. That 185 FPS is a base framerate of ~92 FPS, but because of the additional processing that MFG entails, there's some additional lag on top of what a native 92 FPS experience would normally be. It's going to feel like the game is running at about 75 FPS natively, killing the entire point of a high-refresh experience for many people.
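Working through those numbers as a quick sanity check (the extra MFG processing lag itself is not modeled here, so the real feel is worse than these figures suggest):

```python
# Illustrative numbers from the post above: 185 FPS displayed with 2x MFG.
displayed = 185.0   # FPS shown on screen with 2x frame generation
multiplier = 2
base = displayed / multiplier      # FPS actually rendered by the GPU
base_frametime = 1000 / base       # ms between rendered frames

print(base)                        # 92.5
print(round(base_frametime, 2))    # 10.81
```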
 