
as an AMD fanboy, I gotta say Nvidia's multi frame gen is just lit, even at 1080p

No, I was highlighting a specific idea and point. I do that sometimes. It was not intended to be a personal jab at you.
It came off as petty, which is bizarre for you, and that's why I was kind of baffled by the comment. That's the only reason it really bothered me, to be honest.

Glad we're clear :), apologies for the hostility. People want to drag these threads down very quickly.

FG has at least been hinted at for older hardware:

“I think this is primarily a question of optimization and also engineering and then the ultimate user experience, so we’re launching this frame generation, the best multi-frame generation technology with the 50-series, and we’ll see what we’re able to squeeze out of older hardware in the future.” - Bryan Catanzaro

Any update on whether the 30-series will get a taste of FG? Or maybe that ship has already sailed?
Frame gen on the 30 series kinda seems moot honestly. I don't see the point when the only games supporting frame gen by that point will be newer ones, where the 30 series will be getting lower and lower base framerates to effectively use frame gen with. Not inherently a bad thing of course, but I just don't really see the point. I bet it's going to be a very low priority because of that, if it comes at all.
 
When Nvidia pushed FG, I was strongly against it, simply because it's fake frames that increase latency.
But when it became available for Radeon, I decided to test it just for my own knowledge, so I just plugged AFMF into the driver and it works where it can work. Just to mention that I haven't used FSR anywhere.
In some old single player games where the engines have an FPS limit, it worked quite well.

- In Blades of Time, which is locked at 60 FPS, FG doubles it to 120, and that helps reduce the blur from the lower FPS on my 200 Hz display, so it's more of a plus than a minus.
- In Mass Effect (I think 1), it doubles the locked 240 FPS to 480; the slowdown effect that FG adds actually helped me be more accurate with the sniper.
- In AC Mirage there are a few artifacts on some trees at long range, but nothing too bad, and it helps me play the game at 4K virtual resolution at 180-220 FG FPS.
- In AC Valhalla, FG is pure garbage; everything blurs.
- In God of War, with some specific settings and 4K virtual resolution, I was playing at 300-350 FG FPS; the game engine works very well with FG.
- In The First Descendant (an MMO), I average 190-220 FG FPS. The game is playable, but in some heavy scenes the real FPS probably drops and I see the blur. The slightly higher latency doesn't hurt because the game isn't that dynamic, plus I mainly play with sniper weapons.

Etc...

Which just confirms to me that a decent true FPS base is mandatory for FG to perform well. And Nvidia's pitch that we can get 200+ FPS from a 40 FPS base on low-end cards is pure marketing, if not an outright lie.
So FG "technology" is more for high-end cards than for lower-end ones. And again, you need a monitor that can actually benefit from all those frames.
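Just to put rough numbers behind that (this is purely my own back-of-the-envelope sketch in Python, assuming the common simplification that interpolation-style FG has to hold back roughly one real frame; actual driver pipelines will differ):

```python
# Rough illustration only: interpolation-style frame generation has to buffer
# the next real frame, so responsiveness still tracks the base frame time no
# matter how high the displayed FPS number looks.

def fg_sketch(base_fps: float, multiplier: int) -> None:
    base_frame_time_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    # Assumed simplification: roughly one extra base frame of added latency
    # from holding back the next real frame before interpolating.
    added_latency_ms = base_frame_time_ms
    print(f"base {base_fps:3.0f} FPS ({base_frame_time_ms:4.1f} ms/frame) "
          f"-> ~{displayed_fps:3.0f} FPS displayed, ~{added_latency_ms:4.1f} ms extra latency")

for fps in (40, 60, 120):
    fg_sketch(fps, multiplier=4)  # 4x "multi frame gen" style
```

Even with 4x generation, a 40 FPS base only lands around 160 displayed FPS and still carries 40-FPS-class responsiveness, which lines up with the experience above.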

So it's more game-dependent than anything, which is also worth noting.

Blades of Time is an interesting example. Locked-FPS games: that could definitely scratch the itch I have with some of them. I think Nier Automata is locked to 60 FPS as well; that would be a game I'd want to try this on. I'll try AFMF on my 7900 XT if I can figure it out and see if I feel the same about locked games. Never thought of this before.
 
Nvidia already doesn't pay any attention to the gaming segment; the 5000 series has very little uplift over the 4000 series if you take away fake frames and upscaling, and there was no mention of the gaming market in their Computex coverage. AMD would have to become a multi-trillion-dollar company to have any sort of monopoly on gaming GPUs, and you forget that Intel Arc exists.
And yet they have the 5 fastest cards in raster and the 7 fastest in RT. Feels like everyone else is regressing.

Nvidia also offers more VRAM in lower segments, which is a plus for those who care about VRAM. Below 300 euro, only Intel and Nvidia offer more than 8 GB of VRAM.
 
And yet they have the 5 fastest cards in raster and the 7 fastest in RT. Feels like everyone else is regressing.
I couldn't care less when only one of those cards is even remotely affordable. Nvidia no longer wants to provide real performance increases and is regressing, with minimal performance uplifts over the previous gen while selling performance based on frame gen. It seems like people forgot the "5070>4090" marketing BS.
Below 300 euro, only Intel and Nvidia offer more than 8 GB of VRAM.
And you're cherry-picking again with a 12 GB card from 5 years ago.
As for Intel, the B580 is selling for way above MSRP; it would be solid competition if it weren't selling for nearly $400.
 
I couldn't care less when only one of those cards is even remotely affordable. Nvidia no longer wants to provide real performance increases and is regressing, with minimal performance uplifts over the previous gen while selling performance based on frame gen. It seems like people forgot the "5070>4090" marketing BS.
Well, the thing is, if Nvidia no longer provides real performance increases, how come nobody else even approaches the top spots on the performance charts? I've been hearing the same thing for at least the last 3 years: Nvidia offers no real performance increases. Well, what the heck has everyone else been doing during that time, if Nvidia, while supposedly offering no performance increases this whole time, still has the top 5-7 fastest cards? :eek:

And you're cherry-picking again with a 12 GB card from 5 years ago.
As for Intel, the B580 is selling for way above MSRP; it would be solid competition if it weren't selling for nearly $400.
Do you understand what cherry-picking is? I'm not. I'm including 5-year-old AMD cards in there as well; they just don't have more than 8 GB. Sadly, if you need more VRAM at that price point, AMD doesn't have an option. It's a stingy company when it comes to VRAM.
 
As for Intel, the B580 is selling for way above MSRP; it would be solid competition if it weren't selling for nearly $400.
I'm not sure where you're located, but I've just done a quick look on Caseking and they have the Intel B580 in stock for €269 plus delivery. Let's hope the AMD 9060 series has a decent price and we get a bit of a price war. Doubtful, I know :(
 

For the best low-latency frame-gen performance you need a decent base framerate combined with either Nvidia Reflex or AMD Anti-Lag 2 (and a high enough refresh-rate monitor to take advantage of it). Reflex is great and it's supported in about 100 games. AL2 is functionally equivalent, but it's only supported in 12 games because the previous version, Anti-Lag+, was withdrawn after triggering anti-cheat. You can use AL2 in Reflex games using mods, but going through the Nvidia API means you also lose FSR frame gen. Alternatively you can use LatencyFlex, but AL2 and Reflex are both better when available.
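To illustrate the refresh-rate part, here's a toy calculation of my own (the "cap generated output at the panel's refresh" assumption is mine, not how any driver actually paces frames):

```python
# Toy illustration (my own assumption, not vendor behaviour): treat frames
# generated above the panel's refresh rate as wasted, and see how much of a
# frame-gen multiplier a given display can actually show.

def effective_multiplier(base_fps: float, refresh_hz: float, multiplier: int) -> float:
    generated_fps = base_fps * multiplier
    shown_fps = min(generated_fps, refresh_hz)  # assume output is capped at refresh
    return shown_fps / base_fps

for refresh in (60, 144, 240):
    eff = effective_multiplier(base_fps=70, refresh_hz=refresh, multiplier=4)
    print(f"{refresh:3d} Hz panel, 70 FPS base, 4x FG -> ~{eff:.1f}x actually shown")
```

On a 60 Hz panel most of the generated frames would simply go to waste, which is the point about needing a high-refresh monitor.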
 
On that note, frame gen would be neat in Palworld.
Would be nice if they'd actually fix DLSS in Palworld in the process too. It's been broken since launch, being nothing more than a resolution scale option that forgets to do any actual upscaling. The only way to actually get temporal upscaling in that game right now is TSR.
 
Would be nice if they'd actually fix DLSS in Palworld in the process too. It's been broken since launch, being nothing more than a resolution scale option that forgets to do any actual upscaling. The only way to actually get temporal upscaling in that game right now is TSR.

I'm glad you mentioned this, it's a good reminder that all of these software bonuses are very much game dependent in how well they work.
 
I disagree. I think AMD is going to use its trickle-down approach in a more expanded way, and not leave us in the dust. Take the 5600X3D for example: it had literally no reason to exist other than there being just enough badly binned 5800X3Ds to make it happen. I know this happens everywhere, but I think AMD takes it to the extreme. I think we will keep seeing this.

I wouldn't even bet my tea coaster on these types of short-lived outliers, which are usually limited by stock count, region, seller, etc. Similar to the more affordable AM5 entry-level OEM 7500F, which could easily have been made a global release with the warranty intact. But when the oddballs do surface, keep 'em coming!

The GPU market's a different beast. Outside of gaming, the demands are far too strong for AMD to just sit on its hands; they're clearly sharpening their blades with UDNA. Not for market share alone, but for those fat, juicy high margins (if UDNA holds up). Give AMD a product that outpaces the competition and they'll squeeze every drop out of it (e.g. the current-gen X3D CPUs). No doubt the bigger offender, Nvidia, has set the stage, but AMD has shown no real signs of disrupting Nvidia's game by dropping products that are more attractively affordable. The usual play is to bump up prices just enough to stay below Nvidia's. On a positive note, when things don't work out as planned, AMD has always been quick to trim the fat, which is where the value-driven possibilities become more appealing (especially the 6000 series). I've had a blast upgrading a few machines and significantly bringing the cost down for family/friends/others with a bunch of 6600/6750 XT/6800 cards. Unfortunately, with today's inflated prices (and stock depletion), that is no longer possible.

and not leave us in the dust.

It won't be that bad... just a steeper price of entry. Today, in the middle of 2025, they're pushing 8 GB mainstream cards for around $300+. That's where AMD is betting its horses with the 9060 XT, and perhaps a later non-Ti 5060 from Nvidia. Pants, if you ask me! Dirty, soiled, smelly, skid-mark-ridden pants which even the washing machine is denying entry... "BIN 'EM ALL", she said. I calmly and respectfully approached the angry banger and very politely asked, "but why, my lovely washing machine?" She replied :nutkick: and said "stoopid". How painfully rude!
 
I'm personally strongly against all this upscaling and frame-gen bullshit when its whole purpose is just to get around game developers' laziness about optimizing their games. Things were much better a few years ago.
This is something I see fairly often as an argument to bash upscaling and whatnot, but I don't agree with it.
Yes, some developers do use it as an excuse for poor performance or a lack of optimization (UE5, for example), but please don't act as if we did not have terribly optimized games in the past, even before any of this tech existed.
I've personally played multiple games for years that had complete dogshit performance, and the developers couldn't be bothered to fix it for many years regardless of the complaints.
We even had some of the biggest and most played games, like WoW or StarCraft 2, running like crap because they were bound by one CPU core/thread, and that issue existed in many online games for a long time. (Unluckily, I've played a bunch of them and it was the same crap every single time.)

Nowadays it has switched to being badly optimized mostly on the GPU side of things, so yeah, the issue is not the existence of upscaling or frame gen. It's all the same: the devs are using shitty engines that were badly coded in the first place and can't be arsed to fix them.
 
This is something I see fairly often as an argument to bash upscaling and whatnot, but I don't agree with it.
Yes, some developers do use it as an excuse for poor performance or a lack of optimization (UE5, for example), but please don't act as if we did not have terribly optimized games in the past, even before any of this tech existed.
I've personally played multiple games for years that had complete dogshit performance, and the developers couldn't be bothered to fix it for many years regardless of the complaints.
We even had some of the biggest and most played games, like WoW or StarCraft 2, running like crap because they were bound by one CPU core/thread, and that issue existed in many online games for a long time. (Unluckily, I've played a bunch of them and it was the same crap every single time.)

Nowadays it has switched to being badly optimized mostly on the GPU side of things, so yeah, the issue is not the existence of upscaling or frame gen. It's all the same: the devs are using shitty engines that were badly coded in the first place and can't be arsed to fix them.
I can guarantee you: never before in history have we seen a situation this bad, where $2000–$3000 GPUs struggle to maintain a stable 60fps on high/max settings. And I’m not even talking about RT. So no, I don’t think it’s unfair to say that excessive gimmicks mask poor development practices and lead to underutilized hardware.

 
I can guarantee you: never before in history have we seen a situation this bad, where $2000–$3000 GPUs struggle to maintain a stable 60fps on high/max settings. And I’m not even talking about RT. So no, I don’t think it’s unfair to say that excessive gimmicks mask poor development practices and lead to underutilized hardware.

Of course we have. A 1080 Ti at 1080p dropped to 20 FPS in games older than the card. Now you might argue the 1080 Ti wasn't $2k, but then again, it's 1080p vs 4K. Try Ghost Recon Wildlands or Watch Dogs 2 maxed out. Oh booyyy
 
I can guarantee you: never before in history have we seen a situation this bad, where $2000–$3000 GPUs struggle to maintain a stable 60fps on high/max settings. And I’m not even talking about RT. So no, I don’t think it’s unfair to say that excessive gimmicks mask poor development practices and lead to underutilized hardware.

Can we stop focusing on 4K? Less than 6% of the gaming world is on that resolution. 70%+ of people are on 1080p.
 
I can guarantee you: never before in history have we seen a situation this bad, where $2000–$3000 GPUs struggle to maintain a stable 60fps on high/max settings. And I’m not even talking about RT. So no, I don’t think it’s unfair to say that excessive gimmicks mask poor development practices and lead to underutilized hardware.

Depends on the games you were playing. In my experience, some games from way before any of the upscaling tech were already destroying the performance of even high-end systems of their time, thanks to being badly optimized.
I've been around long enough to experience all of that, and no, upscaling tech alone is not the only thing to blame for the current situation.
Also, talking about underutilized hardware: oh boy, we had it much worse before. Like I've said, have fun with some of those old games that refused to utilize more than one CPU thread even though 4-core CPUs were already kind of normal at the time; the moment you had ~10 players on screen it was sub-30 FPS regardless of your in-game settings. (Basically any of the old MMOs played by millions.)
 
I try to play at 4K whenever I can, sometimes I have to use some trickery to get it done.. but the experience is no less satisfying most of the time..

I would rather do that than drop resolution..
 
Of course we have. A 1080 Ti at 1080p dropped to 20 FPS in games older than the card. Now you might argue the 1080 Ti wasn't $2k, but then again, it's 1080p vs 4K. Try Ghost Recon Wildlands or Watch Dogs 2 maxed out. Oh booyyy
I don’t think anyone would have had the courage to say that the games market during the 1xxx series was as bad as it is now.

WD2 ran at 1080p @ 60 FPS on very high settings with a $250 GTX 1060.
A 1070 ($379) was already enough for ultra.
We must not normalize the need for aggressive upscaling, let alone FG, just to run games on increasingly costly hardware.
 
I don’t think anyone would have had the courage to say that the games market during the 1xxx series was as bad as it is now.

WD2 ran at 1080p @ 60 FPS on very high settings with a $250 GTX 1060.
A 1070 ($379) was already enough for ultra.
We must not normalize the need for aggressive upscaling, let alone FG, just to run games on increasingly costly hardware.
These are not maxed out. The game had two sliders and PCSS shadows. My 1060 was getting single-digit framerates and my 1080 Ti around 20 to 25.

Funny thing is, the graphs you posted are with upscaling; the game had temporal filtering on by default. Turn that off, max the sliders, and welcome to 5 FPS.
 
Funny thing is, the graphs you posted are with upscaling; the game had temporal filtering on by default. Turn that off, max the sliders, and welcome to 5 FPS.
Can confirm, the game was notorious for running like absolute shit with it disabled. Frankly, while the current state of game releases is incredibly poor, I would not want to pretend it is a massive downgrade from some golden era of perfect optimization: AAA games usually run poorly relative to the state of hardware at the time of their release. We just feel it has been exacerbated now because hardware is getting progressively more expensive and games are progressively harder to run for less and less visual benefit. We have basically crossed the line of the performance-visuals balance for classical raster graphics, and now, well, the next big jump in fidelity is actual full-on path tracing. I don't need to explain how hard THAT would be to run.
Frankly, I have long advocated for a freeze on the graphics race for developers, both for the sake of users and themselves, but that isn’t happening anytime soon.
 
Can confirm, the game was notorious for running like absolute shit with it disabled. Frankly, while the current state of game releases is incredibly poor, I would not want to pretend it is a massive downgrade from some golden era of perfect optimization: AAA games usually run poorly relative to the state of hardware at the time of their release. We just feel it has been exacerbated now because hardware is getting progressively more expensive and games are progressively harder to run for less and less visual benefit. We have basically crossed the line of the performance-visuals balance for classical raster graphics, and now, well, the next big jump in fidelity is actual full-on path tracing. I don't need to explain how hard THAT would be to run.
Frankly, I have long advocated for a freeze on the graphics race for developers, both for the sake of users and themselves, but that isn’t happening anytime soon.
100% agreed; this is how I've felt about games for years now.
I'm not against progress, since I do use RT and whatever else in my games when possible, but at this point I feel that developers should focus on the games themselves rather than trying to push graphics further with zero regard for the overall quality of the real finished product.
Honestly, I've been playing some free-to-play games for years now that don't bring anything to the table graphics-wise, and yet I find myself amazed at the amount of passion the developers put into those games compared to some of the so-called AAA developers and their games/end product.
 
I try to play at 4K whenever I can, sometimes I have to use some trickery to get it done.. but the experience is no less satisfying most of the time..

I would rather do that than drop resolution..
Yeah, but you have a 4K display. Most people don't yet.
 
Yeah, but you have a 4K display. Most people don't yet.
I suppose.. but this TV is just a cheapo from Walmart.. I am sure most people are using monitors that probably cost twice as much..
 
Can we stop focusing on 4K? Less than 6% of the gaming world is on that resolution. 70%+ of people are on 1080p.
Except the enthusiasts here aren't 70% of users.
People keep missing the point when complaining about 4K benchmarks: testing at 4K is meant to push a card to its limits, and you can run most games at 1080p on almost any potato.
The point is, as already mentioned, that a $2000 GPU shouldn't be struggling to run modern games at 4K without upscaling and fake frames.
 
Except the enthusiasts here aren't 70% of users.
People keep missing the point when complaining about 4K benchmarks: testing at 4K is meant to push a card to its limits, and you can run most games at 1080p on almost any potato.
The point is, as already mentioned, that a $2000 GPU shouldn't be struggling to run modern games at 4K without upscaling and fake frames.
Maybe it's just a misunderstanding of how the render process works. Not that I'm any professional, but any type of firmware or software that can generate additional frames by utilizing its hardware better shouldn't be considered "fake" when it's using real resources to make it happen. That's my only argument, admittedly looking at it from a biased perspective.

Years ago, this technology was unheard of. Give me extra FPS with my 7600 GT? Call it whatever you want; 1024x768 was a native resolution at one time.

Why complain? Free FPS. Our technology working at its finest. We should embrace it!
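For anyone curious what "generating" a frame even means computationally, here's a deliberately naive toy in Python/NumPy of my own that just blends two rendered frames. Real frame generation relies on motion vectors and optical-flow hardware rather than a plain average, so treat this purely as an illustration that the in-between frames are computed from real rendered data rather than conjured from nothing.

```python
import numpy as np

def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive linear blend between two frames; t=0.5 sits halfway between them."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.clip(0, 255).astype(np.uint8)

# Two random 1080p RGB images standing in for consecutive rendered frames.
frame_a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
middle = interpolate_frame(frame_a, frame_b)
print(middle.shape, middle.dtype)  # (1080, 1920, 3) uint8
```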
 