
As an AMD fanboy, I gotta say Nvidia's multi frame gen is just lit, even at 1080p

They weren't forced; they were offered money and chose to take the payoff for being one of Nvidia's bitches.

Probably, but regardless of whether money changed hands or they simply couldn't get day-one drivers without doing ridiculous preview comparisons, I still lay the blame squarely on Nvidia. I'm shocked any of them thought it would go down well for them.

Is it really that hard for a 3 trillion dollar company to just make a product that's so good it speaks for itself? I guess in 2025 it apparently is.
 
I'm still hopeful that by the end of 2026 AMD will have a flagship on a 3 nm node, and that FSR4 will support a lot of games by then and be more polished.


I love having an all-AMD rig; I've had zero issues with it. I just hope AMD doesn't give up on the flagship. I don't think they will; they're just taking a little break from it all and focusing on budget/midrange, which is completely fine.
 
I'm still hopeful that by the end of 2026 AMD will have a flagship on a 3 nm node, and that FSR4 will support a lot of games by then and be more polished.


I love having an all-AMD rig; I've had zero issues with it. I just hope AMD doesn't give up on the flagship. I don't think they will; they're just taking a little break from it all and focusing on budget/midrange, which is completely fine.

UDNA will be interesting for sure. I'm mildly pessimistic only because I believe they made the switch with AI in mind, not gaming. I guess we'll see in 2026.

I think AMD will do a flagship when they know they can sell one at a decent margin. Why make a massive 600 mm² chip and only get 10-15% margins on it versus selling Epyc CCDs at a massive markup?
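To put rough numbers on that, here's a quick back-of-the-envelope sketch using the standard dies-per-wafer approximation; the wafer size and die areas are illustrative assumptions on my part, not AMD figures.

```python
import math

# Rough dies-per-wafer comparison: one big GPU die vs. small Epyc CCDs.
# All figures are illustrative assumptions, not AMD's actual numbers.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation for whole dies fitting on a round wafer (yield ignored)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

flagship_gpu = dies_per_wafer(600)  # hypothetical ~600 mm^2 flagship GPU die
zen_ccd      = dies_per_wafer(75)   # hypothetical ~75 mm^2 Zen CCD

print(f"~{flagship_gpu} flagship dies vs ~{zen_ccd} CCDs per 300 mm wafer")
# Roughly 90 vs 865 candidate dies from the same wafer, which is why the
# opportunity cost of a big gaming die matters when Epyc margins are high.
```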
 
The tech itself is super impressive, but it's been used to justify the absolutely terrible generational gains of the 50 series. On top of that, Nvidia is now doing previews where they force outlets to use it when comparing against older GPUs, which is beyond scummy.

I love the idea of frame generation, but like anything, it's been abused by developers who don't want to optimize their games, and now it's even being pushed on gamers as extra performance by Nvidia, with the "or else you won't get our drivers day one" BS.

PC gaming is bigger than ever, but I can honestly say that on the games/hardware side it's worse than it's ever been, and FG is part of that. Which is a shame, because it should be a win-more feature, not a crutch...

Blackwell reminds me a lot of Zen 5: both sought to refine the previous generation, but their primary focus just isn't the consumer business. Pretty valid upgrades in their own right, but it's unsurprising that if you place a 4080 Super and a 5080 beside each other, you'll more often than not be completely unable to tell them apart.
 
Forget the input lag; I personally can't stand the artifacting on DLSS FG (or FSR3 FG, for that matter), much less MFG 4x, despite having a 5070 Ti. The artifacts are just too obvious. Frame gen doesn't cope well with overlaid UI elements in any game I've tested (ESPECIALLY translucent UI elements, which break FG on any pixels they cover). Just today I tried out the latest Portal RTX update: MFG artifacts everywhere. Walking over one of the grates is a flickery, interpolated mess, and even just moving the camera and looking at the reflections on the pod in Test Chamber 0 reveals plenty of artifacts. This is at 4K with DLSS4 Quality.

As much as I'd love to say "it's free performance", I can't stand it and never use it for more than 2 minutes. Hopefully the artifacts improve in the future. I know Reflex 2 is supposed to help with the latency aspect at the very least; maybe they'll focus on improving the interpolation itself too.
 
Blackwell reminds me a lot of Zen 5: both sought to refine the previous generation, but their primary focus just isn't the consumer business. Pretty valid upgrades in their own right, but it's unsurprising that if you place a 4080 Super and a 5080 beside each other, you'll more often than not be completely unable to tell them apart.


They are really good at Premiere Pro. Both Blackwell builds I've done so far were not for gaming, unsurprisingly. With the buyers' permission, I did take both for a spin in gaming for a couple of hours; it technically counts as stress testing, lol.

They honestly remind me a lot of Turing, although I think the 2080 Ti impressed me more than the 5090 did. Granted, I only got to try out three games with it, and it was paired with a 285K that was holding it back at 1440p, with slow RAM because they needed 128 GB of it... I never thought I would see a system chug nearly 1100 W with a single GPU, lmao. The last build I did that pulled that much power was a tri-fire 290X build, lol.
 
I never thought I would see a system chug nearly 1100 W with a single GPU, lmao. The last build I did that pulled that much power was a tri-fire 290X build, lol.
My word, there must be more going on there than a 5090 and 285K though, right? Together, at absolute peak sustained draw, those should be ~825 W, and ~275 W feels like a lot for the rest of the components, fans, etc. It also seems improbable during a gaming workload; or was it just in pure synthetic stress testing that you achieved that?

I certainly agree the 5090 isn't all that impressive relative to a 2080 Ti that did something new, or a 4090 that was a massive performance leap from the 3090. Relative to those, the 5090 is just "more" and not all that exciting.
 
UDNA will be interesting for sure. I'm mildly pessimistic only because I believe they made the switch with AI in mind, not gaming. I guess we'll see in 2026.

I think AMD will do a flagship when they know they can sell one at a decent margin. Why make a massive 600 mm² chip and only get 10-15% margins on it versus selling Epyc CCDs at a massive markup?

Yep, seems like it!

Back in 2024, AMD's data centre revenues tipped over 50% of total revenue, and Lisa Su went on to say AMD is now a "data first company". Agreed, UDNA is just that: a stronger focus on data centre applications and AI, which will no doubt drive prices up.

Us gamers will get some of those good-value UDNA scraps from the get-go, but as soon as AMD is in a better position to compete in those higher-margin, rapidly expanding markets, blood will be drawn. There ain't no friends in this game, just friends with benefits until the benefits are more rewarding elsewhere.
 
My word, there must be more going on there than a 5090 and 285K though, right? Together, at absolute peak sustained draw, those should be ~825 W, and ~275 W feels like a lot for the rest of the components, fans, etc. It also seems improbable during a gaming workload; or was it just in pure synthetic stress testing that you achieved that?

I certainly agree the 5090 isn't all that impressive relative to a 2080 Ti that did something new, or a 4090 that was a massive performance leap from the 3090. Relative to those, the 5090 is just "more" and not all that exciting.

Synthetic; at the wall it bounced between 1000-1050 W with power limits removed. During gaming it was 700-750 W at the wall.

My 4090/7950X3D in the same workload is 800 W at the wall, just for comparison, or 650 W during gaming.

Both mine and the 5090 build were tuned down quite a bit without losing hardly any performance, though the 5090 does fine in Premiere Pro and Topaz AI at much lower wattage.
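For anyone wanting to sanity-check those at-the-wall numbers, here's a minimal sketch of the arithmetic: DC component draw divided by PSU efficiency gives the AC figure a wall meter sees. The component wattages and the 90% efficiency are assumptions for illustration, not measurements from either build.

```python
# Rough wall-power estimate for a 5090 + 285K build.
# Component draws and PSU efficiency below are assumptions, not measured values.

def wall_draw_watts(component_watts, psu_efficiency=0.90):
    """Convert total DC component draw to estimated AC draw at the wall."""
    return sum(component_watts.values()) / psu_efficiency

synthetic_load = {
    "RTX 5090 (power limit removed)": 600,   # assumed
    "Core Ultra 9 285K": 250,                # assumed
    "Board, RAM, drives, fans, pump": 80,    # assumed
}

print(f"Estimated at the wall: {wall_draw_watts(synthetic_load):.0f} W")
# ~1030 W, in the same ballpark as the reported 1000-1050 W synthetic figure.
```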
 
Yep, seems like it!

Back in 2024, AMD's data centre revenues tipped over 50% of total revenue, and Lisa Su went on to say AMD is now a "data first company". Agreed, UDNA is just that: a stronger focus on data centre applications and AI, which will no doubt drive prices up.

Us gamers will get some of those good-value UDNA scraps from the get-go, but as soon as AMD is in a better position to compete in those higher-margin, rapidly expanding markets, blood will be drawn. There ain't no friends in this game, just friends with benefits until the benefits are more rewarding elsewhere.

I disagree; I think AMD is going to use its trickle-down approach in a more expanded way and not leave us in the dust. Take, for example, the 5600X3D: there's literally no reason for it to exist other than there being just enough badly binned 5800X3Ds to make it happen. I know this happens everywhere, but I think AMD takes it to the extreme, and I think we will keep seeing this.
 
I'm personally strongly against all this upscaling and framegen bullshit when its whole purpose is just to work around game developers' laziness in optimizing their games. Things were much better a few years ago.
 
I'm personally strongly against all this upscaling and framegen bullshit when its whole purpose is just to work around game developers' laziness in optimizing their games. Things were much better a few years ago.

I think they are preparing us for node shrinks no longer giving us gains. This is the future in 5-10 years whether we like it or not, unless IBM comes out with their non-silicon, light-based nanotube designs.
 
UDNA will be interesting for sure. I'm mildly pessimistic only because I believe they made the switch with AI in mind, not gaming. I guess we'll see in 2026.

I think AMD will do a flagship when they know they can sell one at a decent margin. Why make a massive 600 mm² chip and only get 10-15% margins on it versus selling Epyc CCDs at a massive markup?
It really scares me to think that Nvidia will be so big in AI that they will stop paying any attention to the gaming segment. AMD being a monopoly in gaming GPUs? Oh god, they have proven repeatedly how user-friendly they are. We will be toast :eek:
 
Um, no. I've played with this. It "feels" fine.
His statement implied it feels just as smooth as having that exact same framerate; that objectively isn't true. You can get pretty close if you tweak settings, which of course you're gonna wanna do with FG & MFG, but that's what that statement is referring to. Perhaps you should read the full context before replying?

Never said it felt bad; I personally don't like it, but I never once stated that is the case for everyone. Again, I'm all for it in single-player games. I think it's pretty cool in that regard. Would I use it personally? No, I stated this before. But I think there ARE valid use cases for it, and I've pointed these out.

Frame generation is cool and it makes for nice visual step-ups.
It still needs a decent base frame rate and acceptable low dips to generate from, and it eats up VRAM in a way that might be an issue for some.
Not sure how much VRAM it uses, but it definitely needs a decent base framerate of like 60 or higher to get good results, latency-wise, AFAIK. In the example with base FG on that 4070 Ti I got to test, they had a base framerate of about 80/90 FPS in my very limited testing. I can't remember for the life of me what game it was... I wanna say Diablo 4, but I'm not sure...

What is not cool is Nvidia trying to make it look like the new native, and even less cool is Nvidia trying to force reviewers to adopt their overreaching marketing narrative supporting these performance claims.
The 5070 != 4090 with MFG, and the politics around the 5060 reviews seem downright horrible. I know that overreacting, posturing and being a bit of a fanboy are part of the game and add to the revenue stream, but in the end, if there are no independent reviews, us consumers will be the ones hurting.

So MFG is a good thing, DLSS is a good thing, but the way they are portrayed by marketing at the moment is not.
It's how they're implemented and utilized that matters. If it's not implemented or utilized well, it will just die off as a 'gimmick' and go the way of CrossFire/SLI. That being said, I don't think frame generation is going anywhere, unless we keep putting it in games like Marvel Rivals, where it really, really has no place being.

This thread is for trolling
I think there's a valid discussion to be had if people can just avoid getting their knickers in a twist when someone says something they don't agree with. Too many people are ready to completely jump on someone and dismiss their opinion, whether it has valid points or not, because they said something they do not agree with.

I'd like to have an actual discussion with people who do not like FG (such as myself) and people who do, where instead of having a pissing contest we actually have a constructive conversation about it. But it seems that won't be the case. In the end, it just devolves into a Reddit thread, pretty much.

Some of it is definitely marketing; it's not an end-all-be-all improvement, but there are MANY valid use cases for it, and I personally would never go so far as to say it 'plays like absolute garbage'. NVIDIA Reflex exists, after all.
 
I hate fake frames; I will not change and I will die on this hill. He wouldn't have to do this if games weren't so poorly optimised and Nvidia/AMD weren't so greedy, treating the people that made their companies what they are today as second-class customers. I'm also not a competitive player, so most games over 80 FPS are just fine for me; I hardly ever push the full 165 FPS my monitor can handle, as I just try to push graphics quality a bit further if I can.

Anyway, I recently tried a 5070 Ti at a friend's house in a couple of games, and I've got to say I was impressed with the multi fake frame stuff. Not sure if things would be the same if it were a 5060 (for me it will always be called a 5050, but I digress) and we were going from 20 to 200 FPS, but still, I can't deny reality. It's impressive.
 
When Nvidia pushed FG, I was strongly against it, simply because it's fake frames that increase latency.
But when it became available for Radeon, I decided to test it just for my own knowledge, so I just enabled AFMF in the driver, and it works where it can work. Just to mention, I haven't used FSR anywhere.
In some old single player games where the engines have an FPS limit, it worked quite well.

- In Blade of Time, which is locked at 60 FPS, FG doubles it to 120, and that helps reduce the blur from the lower FPS on my 200 Hz display, so it's more a plus than a minus.
- In Mass Effect (I think 1), it doubles the locked 240 FPS to 480; the slowdown effect that FG adds really helped me be more accurate with the sniper.
- In AC Mirage there are a few artifacts on some trees at long range, but nothing too bad, and it helps me play the game at 4K virtual resolution with 180-220 FG FPS.
- In AC Valhalla the FG is pure garbage; everything blurs.
- In God of War, with some specific settings and 4K virtual resolution, I was playing at 300-350 FG FPS; the game engine works very well with FG.
- In The First Descendant (an MMO), I average 190-220 FG FPS. The game is playable, but in some heavy scenes the real FPS probably drops low and I see the blur. The slightly higher latency doesn't hurt because the game isn't that dynamic; plus, I'm mainly playing with sniper weapons.

Etc...

Which just confirms to me that a decent true FPS is a mandatory foundation for FG performance. And Nvidia's gimmick that we can get 200+ FPS from a base of 40 FPS on low-end cards is pure lie/marketing.
So the FG "technology" is more for high-end cards, not for lower-end cards. And again, we need a monitor that can actually make use of all those frames.
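As a rough back-of-the-envelope illustration of why the base framerate matters so much (my own simplification, not any vendor's published latency model): interpolation has to hold back roughly one real frame before it can show the generated ones, so the added delay scales with the base frame time.

```python
# Back-of-the-envelope frame-gen sketch: assumes the interpolator holds back
# about one real frame, so added delay ~ one base frame time. A simplification,
# not a vendor latency model.

def fg_estimate(base_fps, multiplier):
    base_frame_time_ms = 1000 / base_fps
    displayed_fps = base_fps * multiplier
    added_latency_ms = base_frame_time_ms  # assumed: ~1 real frame held back
    return displayed_fps, added_latency_ms

for fps in (40, 60, 120):
    shown, delay = fg_estimate(fps, multiplier=4)
    print(f"base {fps:>3} FPS -> ~{shown} FPS shown, ~{delay:.1f} ms extra delay")

# base  40 FPS -> ~160 FPS shown, ~25.0 ms extra delay
# base  60 FPS -> ~240 FPS shown, ~16.7 ms extra delay
# base 120 FPS -> ~480 FPS shown, ~8.3 ms extra delay
```

Which lines up with the point above: the "40 FPS base turned into 200+ FPS" pitch costs a full base frame of extra delay, while FG on top of an already-high framerate barely hurts.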
 
I disagree; I think AMD is going to use its trickle-down approach in a more expanded way and not leave us in the dust. Take, for example, the 5600X3D: there's literally no reason for it to exist other than there being just enough badly binned 5800X3Ds to make it happen. I know this happens everywhere, but I think AMD takes it to the extreme, and I think we will keep seeing this.
AMD doesn't go to any more extremes with CPUs than Intel does with a cut-down Core 3 or Core 5. The trickle-down approach with UDNA is about the manufacturing efficiency of having a single line of GPUs capable of both gaming and compute. Enough people whined about AMD GPUs not having compute, so we'll likely end up paying more for gaming cards.
It really scares me to think that Nvidia will be so big in AI that they will stop paying any attention to the gaming segment. AMD being a monopoly in gaming GPUs? Oh god, they have proven repeatedly how user-friendly they are. We will be toast :eek:
Nvidia already doesn't pay any attention to the gaming segment: the 5000 series has very little uplift over the 4000 series if you take away fake frames and upscaling, and there was no mention of the gaming market in their Computex coverage. AMD would have to become a multi-trillion-dollar company in order to have any sort of monopoly on gaming GPUs, and you forget that Intel Arc exists.
 
Nvidia already doesn't pay any attention to the gaming segment: the 5000 series has very little uplift over the 4000 series if you take away fake frames and upscaling, and there was no mention of the gaming market in their Computex coverage. AMD would have to become a multi-trillion-dollar company in order to have any sort of monopoly on gaming GPUs, and you forget that Intel Arc exists.

And AMD pays much attention to gaming? The 9070 XT is 100 EUR less than the 5070 Ti here, with worse RT performance; the same old song we've seen countless times.
Nvidia f'd up the drivers, sure, there is that, but that doesn't mean they can't fix them or that they've abandoned gaming, just as it didn't mean that when AMD had the shittiest drivers. Aren't we exaggerating a bit?

The VRAM thing? Nvidia has been doing that since before AI was a buzzword.
 
And AMD pays much attention to gaming? The 9070 XT is 100 EUR less than the 5070 Ti here, with worse RT performance; the same old song we've seen countless times.
Nvidia f'd up the drivers, sure, there is that, but that doesn't mean they can't fix them or that they've abandoned gaming, just as it didn't mean that when AMD had the shittiest drivers. Aren't we exaggerating a bit?

The VRAM thing? Nvidia has been doing that since before AI was a buzzword.
Um, what is in the PS5, Xbox One, Steam Deck, ROG Ally, Legion Go, and even now the MSI Claw? Those all sound like gaming devices to me.
 
And AMD pays much attention to gaming? The 9070 XT is 100 EUR less than the 5070 Ti here, with worse RT performance; the same old song we've seen countless times.
Nvidia f'd up the drivers, sure, there is that, but that doesn't mean they can't fix them or that they've abandoned gaming, just as it didn't mean that when AMD had the shittiest drivers. Aren't we exaggerating a bit?

The VRAM thing? Nvidia has been doing that since before AI was a buzzword.
AMD pays much more attention to gaming than Nvidia has been: cards with better price-to-performance value, more VRAM, and "the drivers are bad" hasn't been true for years. I'm assuming the EU market was flooded with GeForce cards while the NA market had more Radeon 9070 and 9070 XT cards; before the market instability, the 9070 XT was $150-200 less than a 5070 Ti and retailers couldn't keep them in stock.
The point is, Nvidia f%$^ing up the drivers so badly is a sign they don't care, and they're probably using AI to write the drivers.

Anyway, my opinion is that frame gen could have been a useful technology, but AAA game companies are using it to launch unoptimized games, and now Nvidia wants reviewers to say it's free performance or else they won't get drivers on launch day. Thanks to Nvidia we have a bunch of proprietary feature BS, which has only made gaming worse; instead of improving native-resolution performance, they want to sell performance upscaled from 1080p and fake frames.
 
I understood the context perfectly. :rolleyes:
So you're saying you understood the context perfectly and cherry-picked your quote, leaving out the rest of the context? Okay? Be that way, I guess.


Incredibly strange coming from you especially. If you understood the context, then you know what my opinion is, and they align more than you think. I'm just more realistic than optimistic. If you wanna write my opinion off, then sure. But at least I'm not trying to demonize frame gen as some evil hellspawn NVIDIA created to lie to the masses, when it's just a cool piece of tech that has some asterisks to it.

On that note, frame gen would be neat in Palworld.

I certainly agree the 5090 isn't all that impressive relative to a 2080 Ti that did something new, or a 4090 that was a massive performance leap from the 3090. Relative to those, the 5090 is just "more" and not all that exciting.
On an off-topic note, this isn't too surprising, to be fair, and the 5090 still has a fair bit more generational uplift than the other 50-series cards. So far my opinion about the 50 series remains the same: only a 5070 Ti at MSRP and a 5090 are really worth grabbing, both for different reasons.

I'm still surprised how quickly the 3090 got surpassed. The 4090 is gonna be remembered well because of its huge uplift over the 3090, I think, even though it didn't do anything new or exciting (if we aren't talking about frame gen, which honestly the 4090 didn't even need for its time, IMO).
 