
AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

From the looks of it, it will contest the RTX 3070, not the 3080. Basically a 2080 Ti competitor. Obviously we know very little about it, power consumption, etc., but Lisa said it is Big Navi, so it is safe to assume it will be their leading card. Unless they come up with an uber Navi, etc.

Well, if they can average only a 10% performance deficit, and knowing the 3080 has zero OC headroom to speak of (even an undervolt is good on it), they're actually pretty competitive with a 3080. Now if they also provide 12-16GB to overtake the 10GB of the 3080... I think I'd know what card to pick, even at 650-720 ish for the AMD one. That's definitely competing with 3080, even if the net perf is a bit lower. It'll likely age a lot better and could be another 680 vs 7970 moment.

I'm usually hypercritical of what's on the table, but this looks pretty good. If the TDP is kept in check and it runs smoothly... AMD has a great release on their hands. And let's pray for no driver or BIOS oopsies this time, or at least a rapid and definitive hotfix scenario. That's a big thing, and RDNA1 didn't do itself any favors there.

We also don't know how close the 3070 will get to 2080 Ti performance, but I reckon they will match it. Not handily beat it; that only happens situationally. In that case... it's certainly not competing with the 3070, it's on another tier entirely. This is definitely shaping up to be on the upper end of what I had expected from RDNA2.

Additionally, I think it's realistic to expect this not to be a full chip, so there's that too.
 
I think Nvidia estimated correctly what AMD would launch based on what AMD did with the consoles, and pushed the hell out of that silicon, reaching a new record for GPU power consumption.
If you build a computer with a 10900K and an RTX 3080 now, you can skip buying a heater for the winter; these two can actually heat up the room :).
 
If it's around 5% slower than the 3080 now, then it will eventually be 5% faster with mature drivers later on. Plus, there is no way Lisa Su shows their top dog this early; it's literally a jebait. If it's a 64 or 72 CU card, then it's GG for Nvidia.
The fine wine argument again; that's a nonsensical argument, and more than a little misleading.
Buy an inferior product today because it may be superior sometime later? It never happens.
Nvidia's drivers improve performance just as much as AMD's over time.

AMD did trick Nvidia last time; they duped them on a couple of things, if I recall correctly. On pricing. So it is possible this isn't their top dog card. lol, we will see soon enough.
Even if they did, I don't believe misleading competitors or consumers is a good thing. We need healthy competition and fair play.

Well, if they can average only a 10% performance deficit, and knowing the 3080 has zero OC headroom to speak of (even an undervolt is good on it), they're actually pretty competitive with a 3080.
If the performance lands between RTX 3070 and RTX 3080 and the price does too, then it's fair. But if it's less performant at the same price, then people shouldn't buy it.
Even if "big Navi" happens to have more OC headroom, that shouldn't be a general sales argument, most gamers want stable computers for gaming, not overclocking for setting records.

Now if they also provide 12-16GB to overtake the 10GB of the 3080... I think I'd know what card to pick, even at 650-720 ish for the AMD one. That's definitely competing with 3080, even if the net perf is a bit lower. It'll likely age a lot better and could be another 680 vs 7970 moment.
Aah, the eternal "future proofing" argument; buy inferior stuff with superior "specs" that will win over time. :rolleyes:
Our best guidance is always real world performance, not anecdotes about what will perform better years from now.
 

You're forgetting that even at 720 they will still undercut Nvidia on MSRP for any AIB version really. Like you say. If the price is right, as always. Thing is, I'm not liking the balance the 3080 has with core / VRAM cap. You're at liberty to think otherwise, but in a similar way, the 680 was in a remarkably similar place. I didn't pull that comparison out for nothing. We're at the eve of a new console gen here and those also carry somewhat higher VRAM capacities than Nvidia's finest, and I think it needs no discussion that the 2GB Kepler/refresh cards were obsolete faster than many would have liked. Faster than I liked personally, at least: I bought a 770 and replaced it with a 780 Ti just to get the 3GB of VRAM, as 2GB was falling short even at 1080p in quite a few titles. Only one gen later the VRAM caps doubled, and for good reason: the consoles had more too.

Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.

Ironically, the best handle on future proofing is gained by looking back. History repeats.
 
You know why AMD ages better than Nvidia (fine wine...)? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
New games will widen the gap between old and new more and more.
 
You know why AMD ages better than Nvidia (fine wine...)? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
New games will widen the gap between old and new more and more.

Wrong, both competitors keep adding support for new titles on all active GPU families.
 
You're forgetting that even at 720 they will still undercut Nvidia on MSRP for any AIB version really.
There are plenty of solid AIB models at the same MSRP as Nvidia's Founders Edition ($700).

Like you say. If the price is right, as always. Thing is, I'm not liking the balance the 3080 has with core / VRAM cap. You're at liberty to think otherwise, but in a similar way, the 680 was in a remarkably similar place.
And I always say the truth is in good (real world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs, etc. feel right.
As I've told you before, the fact remains that increasing VRAM usage nearly always requires both more bandwidth and more computational performance to utilize it, which is why you will usually hit other bottlenecks long before VRAM.
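To put a rough number on that coupling, here's a back-of-the-envelope sketch; the per-frame working set and FPS are made-up illustrative inputs, and only the 3080's ~760 GB/s peak bandwidth is a real spec:

```python
# Back-of-the-envelope: the bandwidth it takes to actually *use* VRAM
# every frame. All inputs below are illustrative assumptions.

def sustained_read_bandwidth_gbs(working_set_gb: float, fps: float) -> float:
    """GB/s of reads if the GPU touches `working_set_gb` of unique data per frame."""
    return working_set_gb * fps

demand = sustained_read_bandwidth_gbs(working_set_gb=8, fps=60)
print(f"~{demand:.0f} GB/s just for reads")  # ~480 GB/s

# The RTX 3080 peaks at roughly 760 GB/s, and writes, overdraw and cache
# misses all eat into that, so actively cycling much more than ~10 GB per
# frame hits the bandwidth wall before the 10 GB capacity wall.
```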

We're at the eve of a new console gen here and those also carry somewhat higher VRAM capacities than Nvidia's finest
Not really, that's combined system RAM and VRAM, so it's not a fair comparison.

Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.
No objection there. ~250W seems to be the spot where cooling becomes a hassle.

You know why AMD ages better than Nvidia (fine wine...)? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
Exactly how many years do you have to wait, then?
The fact is Nvidia retains legacy support while AMD drops it.
AMD can't even get their current driver support right; how can they be better at legacy support?

New games will widen the gap between old and new more and more.
Yeah right.
I'm sure those old Radeon 200/300 will finally be unleashed any day now. :rolleyes:
 
Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.
Actually, I'm OK with a 300-320W draw, as I owned an R9 390X from 2015 to 2018. But I'm also case-less, so that helps.
 
Wrong, both competitors keep adding support for new titles on all active GPU families.
Yeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090: a perfect candidate for a "driver fix". I think that's the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in old games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks: they will launch a mediocre GPU with bad drivers at first, and it will probably consume a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very, very few "enthusiasts" who might buy these high-end cards. That's why the first drivers will be so bad you're lucky if you boot into Windows; why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not to really compete.
 
Yeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090: a perfect candidate for a "driver fix". I think that's the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in old games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks: they will launch a mediocre GPU with bad drivers at first, and it will probably consume a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very, very few "enthusiasts" who might buy these high-end cards. That's why the first drivers will be so bad you're lucky if you boot into Windows; why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not to really compete.
Okay, so, interesting way to spend a Friday: I disagree with some of that.
No, sorry, all of it.

Opaquifying the B's for mortals, good work.
 
I'm not sure this is cherry picking. Many of the games chosen favor Nvidia, not AMD. The same could be said of AMD's recent Zen 3 announcement; Far Cry Primal, for example, heavily favors Intel, yet AMD chose to use it.

Of course I am going to wait for 3rd party benchmarks, but it's nice to see a company that isn't trying to insult your intelligence with "2X the Performance!!!!! *At 8K with RTX enabled vs a 2080".
 
I just checked on AMD's bicycles. They are sold out as well, lmao. wow. just wow.
 
Some of you are really overstating the importance of having the fastest GPU. The 3080 is so expensive that it won't matter to most gamers. Ever.
I'll go for bang for buck, as always.
 
Some of you are really overstating the importance of having the fastest GPU. The 3080 is so expensive that it won't matter to most gamers. Ever.
I'll go for bang for buck, as always.

Why do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.
 
Why do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.

Because the RX 5700 was a good bang for your buck GPU as well.
 
Because the RX 5700 was a good bang for your buck GPU as well.

Saying you only go for bang for buck means you would choose the highest bang-for-buck GPU, right?

But when you would like more performance while still getting good bang for buck, the 3080 is just as good bang for buck as the 5700; they come to around the same FPS/dollar (provided you can buy a 3080 right now :D).

So had @Kaleid said "Most people can't afford the 3080", I wouldn't have any issue with that, since Ampere is rolling out at every price point just like Turing anyway.
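For what it's worth, here's that FPS/dollar arithmetic spelled out; the MSRPs are the real launch prices ($349 and $699), while the FPS figures are illustrative placeholders rather than benchmark results:

```python
# FPS-per-dollar comparison; the avg_fps values are placeholders.
cards = {
    "RX 5700":  {"price_usd": 349, "avg_fps": 60},   # launch MSRP; assumed FPS
    "RTX 3080": {"price_usd": 699, "avg_fps": 120},  # FE MSRP; assumed FPS
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} FPS per dollar")

# Both land around 0.17 FPS/$ with these placeholder numbers, i.e. roughly
# the same bang for buck despite sitting in very different performance tiers.
```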
 
Saying you only go for bang for buck means you would choose the highest bang-for-buck GPU, right?

But when you would like more performance while still getting good bang for buck, the 3080 is just as good bang for buck as the 5700; they come to around the same FPS/dollar (provided you can buy a 3080 right now :D).

So had @Kaleid said "Most people can't afford the 3080", I wouldn't have any issue with that, since Ampere is rolling out at every price point just like Turing anyway.
Not even close to the same, especially depending on location and prices. My location doesn't have a 3080 below $1000 CAD. And since the two cards are from two different generations, it isn't worth comparing them; the RX 5700 was released to compete around the RTX 2060.

Better to wait and see the RDNA 2 results to compare bang for buck properly.
 
Some of you are really overstating the importance of having the fastest GPU. The 3080 is so expensive that it won't matter to most gamers. Ever.
I'll go for bang for buck, as always.

So where do they cut cost? Performance inherently improves price/perf. They can't just sell cards at any price they want and still earn a profit. There has to be some engineering improvement to undercut NVIDIA if the performance is similar, and even more so if it's lower.

The cards are in NVIDIA's hands until they release the 3060.
 
Could it be? A decent Radeon release after so long?
Nah, they will find a way to screw it up, and it better not be disabling ray tracing due to "reasons".
 
Not even close to the same, especially depending on location and prices. My location doesn't have a 3080 below $1000 CAD. And since the two cards are from two different generations, it isn't worth comparing them; the RX 5700 was released to compete around the RTX 2060.

Better to wait and see the RDNA 2 results to compare bang for buck properly.

Bang for buck is only an excuse for lower-performing products. Zen has gone from best bang-for-buck CPU to premium CPU now. If AMD had the best GPU, they would price it accordingly. At the end of the day, nobody wants to get paid minimum wage :laugh:.

I reckon RDNA2 had better have 30% more perf/dollar than Ampere in rasterization; only then does it make sense to overlook its lack of finesse in other areas (RT/DLSS/HW encoding/drivers/Reflex).
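Spelling out what that threshold would mean for pricing (the 3080's $699 MSRP is real; the relative performance figure is purely an assumption):

```python
# What "+30% perf/dollar vs Ampere" implies for an RDNA2 price tag. Sketch only.
ampere_price_usd = 699    # RTX 3080 FE MSRP
ampere_perf      = 100    # normalize 3080 raster performance to 100
target_advantage = 1.30   # desired perf/dollar lead over Ampere

rdna2_perf = 90           # assumption: RDNA2 lands at 90% of a 3080

ampere_perf_per_usd = ampere_perf / ampere_price_usd
max_rdna2_price = rdna2_perf / (ampere_perf_per_usd * target_advantage)
print(f"Max RDNA2 price for a +30% perf/$ lead: ${max_rdna2_price:.0f}")  # ~$484
```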
 
Why do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.

I'm also... obviously aiming for a certain performance at a certain resolution.
Besides, I bought the 5700 used, so I saved 100 USD; the 580 wouldn't have had any chance when it comes to bang for the buck.
And since the 3080 has been inserted into the discussion: sure, it also provides a lot for the money, but since I'm at 1440p its performance would be utterly wasted on my monitor.

AMD raised the 5000 series CPU prices, but I'm quite certain very few expect AMD to do the same with their new GPUs.

I'll change GPU next year, but not until I get to see MS DirectStorage in action.
 
What? They didn't tout it as such - in their own slides.

That's really where the card shines; after it was released I tried it at both 1440p and 4K, compared to my old GTX 1080 Ti.
 
Hmm, when you get off AMD's cherry-picked games, and account for the Ryzen 5900 CPU helping and the 16GB of memory helping (especially at 4K resolution), it looks clear that the RTX 3080 is 20-25% faster than Big Navi.

So, choose a 20GB RTX 3080 and use the same Ryzen 5900 CPU for the test... no doubt.

But sure, it seems clear AMD can't make any miracle with what it's building: still old parts and the usual updates. I mean, the RX 6900 XT is basically 2x 5700 XT in CrossFire mode, so the RX 6900 XT can't deliver more performance than that. And AMD's quoted TDP for that performance, 300W, is a joke; I say it will take over 350W, and that's if it even reaches 80% of RTX 3080 speed.
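To put rough numbers on that skepticism, a quick sketch, assuming the 5700 XT's 225W board power and AMD's advertised +50% perf/W gain for RDNA2 (marketing claims, not measurements):

```python
# Naive "2x 5700 XT" power estimate, a sketch under the stated assumptions.
navi10_tbp_w  = 225   # RX 5700 XT total board power
scale         = 2.0   # "two 5700 XTs worth" of hardware
perf_per_watt = 1.5   # AMD's claimed RDNA2 efficiency gain, assumed to hold fully

naive_power_w   = navi10_tbp_w * scale           # 450 W at RDNA1 efficiency
claimed_power_w = naive_power_w / perf_per_watt  # 300 W if the claim holds

print(f"Naive doubling: {naive_power_w:.0f} W")      # 450 W
print(f"With +50% perf/W: {claimed_power_w:.0f} W")  # 300 W
```

So the advertised 300W only works out if the full +50% perf/W gain materializes; anything less and the 350W guess above starts to look plausible.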

RT speed is also interesting, and in that category AMD is in its infancy. RT support, by the way, eats more juice... 25W?

We can imagine, or better, wait for this TechPowerUp review:

RTX 3080 FE 10GB/20GB + Ryzen 5900 CPU vs RX 6900 XT 16GB + Ryzen 5900 CPU

I can bet on a 25% win for the RTX 3080, and I'd add at least another 20% difference when the RTX 3090 + Ryzen 5900 is put against the RX 6900 XT.

I see that the higher GPUs of the RTX 3000 series need a powerful CPU as a partner; for now I'd say a Ryzen 5900 or so, later Intel's 10nm and of course 7nm parts.

Hmm, the RTX 3070/Ti against the RX 6900 XT... well, what's sure is that the RTX 3070 is cheaper and might fight hard against the RX 6900 XT... lose by 10% and be $200 cheaper?

Let's see... waiting for the tests... only a few weeks!

It would also be nice to get Intel's Xe GPU onto the battle line; then we'd have three GPUs in the fight!
 
I just checked on AMD's bicycles. They are sold out as well, lmao. wow. just wow.
Well, here's hoping AMD GPUs aren't as shitty as those bikes.
 
I think Nvidia estimated correctly what AMD would launch based on what AMD did with the consoles, and pushed the hell out of that silicon, reaching a new record for GPU power consumption.
If you build a computer with a 10900K and an RTX 3080 now, you can skip buying a heater for the winter; these two can actually heat up the room :).
Stick the water cooler radiators out the window near heat-loving plants; or use a thermocouple to convert the heat to run fans; or use it to boost the cold water temps for your water heater... all sorts of fun things to do with that heat :)
 