
AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

I have nothing to say about the numbers, but I can say the Borderlands 3 presentation was using the built-in benchmark; I recognized it.
 
Yeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090: a perfect candidate for a "driver fix". I think that title is where the RTX 3080 distances itself the most from the 2080 Ti, not so much in older games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks. They will launch a mediocre GPU with bad drivers at first, and it will probably consume a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very few "enthusiasts" who might buy these high-end cards; that's why the first drivers will be so bad you're lucky if you boot into Windows. Why bother, for practically no money compared to what they sell to the consoles?
They are here to piss off Nvidia a bit, not really compete.

No. Just... no, sorry. You can leave this drivel elsewhere.

There are plenty of solid AIB models at the same MSRP as Nvidia's Founders Edition ($700).


And I always say the truth is in good (real-world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs, etc. feel right.
As I've told you before, the fact remains that increasing VRAM usage nearly always requires both more bandwidth and more computational performance to utilize it, which is why you will usually hit other bottlenecks long before VRAM.


Not really, that's combined system RAM and VRAM, so it's not a fair comparison.


No objection there. ~250W seems to be the spot where cooling becomes a hassle.


Exactly how many years do you have to wait then?
The fact is Nvidia retains legacy support, AMD drops it.
AMD can't even get their current driver support right, how can they be better at legacy support?


Yeah right.
I'm sure those old Radeon 200/300 will finally be unleashed any day now. :rolleyes:

You can keep repeating the blanket statements, but I'm referring specifically to the balance of this particular 3080. It's a typical Nvidia move to position it like this, and after the 670/680/770 and numerous other examples, I'm pretty capable of making predictions in this regard.

The GPU has shit balance, like it or not, and $700 isn't exactly a steal for such a product.
 
So much ignorance in this thread. One thing that I loved, though, is that all the discussion was as civil as possible. Now, if anyone here understands marketing then you know this: that was not AMD's best foot forward. I'd be worried if I'm Nvidia (more so if that's a 64 CU card). People love to talk about track record (especially since AMD hasn't competed in this segment for years), but fail to realize AMD's recent track record: they "jebaited" several times recently with Zen 2, the 5700 XT, the 5600 XT, and technically Zen 3.

Whether we agree or disagree, Lisa Su and Co. have shown that they can deliver on their promises, and I for one clearly think AMD would be considered morons to hold a separate event to release a "flop" of a GPU. It's clear to me and a select few that they've got something special on their hands and are being real coy about it. Do you think the teaser at the end wasn't on purpose? They've been doing it all along. Look at the rumor mill when it came to Zen 3 vs. RDNA2. It was clear to me what was prioritized and where. While Zen 3 is a big deal, RDNA2 is where the statement needed to be made.

IIRC no one here mentioned that these cards are cheaper to make on 7nm overall vs. Ampere. These cards will be very efficient and priced well. AMD is also doing what very few companies have done: being honest. These benchmarks clearly show AMD used games they struggle with (both CPU and GPU). Seems to me like they are showing worst-case scenarios to gauge the internet's reaction, and it's working. The GPU event was obviously last for a reason: they have the fastest CPUs in town and thus needed the GPUs to be tested with the best. This also shows off their ecosystem and, lastly, surprises all the naysayers who can't seem to understand that they can compete. RDNA1 matched Turing's IPC, and Nvidia regressed in that department with Ampere.

A "halo" product ain't a damn 3070 competitor.
Why is that so hard for fanboys to understand?
 

Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. The Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi10 is only slightly faster than TU106 (2070), it is already a remarkable improvement if Navi21 is faster than GA104, I will give you that. GA102, however, is a different beast.

And Navi21 is cheaper to produce than GA102? Do you know that AMD pays ~$9k per 7nm wafer at TSMC while Nvidia pays only ~$3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin, after all (66% last quarter).

The best Navi21 silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi21 competing with and beating GA104, which is still good.
 
Will it have a good Vulkan driver for Red Dead Redemption 2, or will it freeze like the previous generation?
 
I understand, some people can't comprehend that they can do that: handicap cards through drivers. Nah, impossible; corporations operating as a monopoly when they have no competition, this is nonsense.
While some people here can't get their heads around this, the majority don't see a reason to ever upgrade beyond a GTX 1060, or something equivalent from AMD, until it's dead. The lack of good games makes PC so unattractive, and the rare good ones are so well made they don't need an RTX 3080 or whatever AMD makes. Just look at the recent Star Wars: Squadrons or previous ones like Jedi: Fallen Order; they run fine on a GTX 1060 and its AMD equivalent. Big publishers like EA or Blizzard don't care about GPU sales, and it really hurts Nvidia when they put something like that on the market. Imagine gaming on a four-year-old card with all settings turned up and the graphics are top notch :laugh:
 
If they get stock before my 3080 eventually arrives (late NOVEMBER! FOR A PREORDER!), I'll just damn well switch to team red again.
 
6900 XT: about 2% faster than the 3080, and $50 cheaper.

An overclocked 3080 is exactly 200% of a 5700 XT.
The 6900 XT is double a 5700 XT and then some.
 

Did it occur to you that I never even asked you to provide a source for these old and tired claims? And that you never provided one either?

This is because I have already seen them all, and none of them holds any water whatsoever. You can spare yourself the trouble. And again, you can either learn a thing or two here or you can live in fantasy land, but again I'd say: do it elsewhere. I'll torch any nonsense I see, and so far you've been full of it.

Consider carefully what you might post next. This subject has been debunked a hundred times already ;) I'm not going there again.
 
Did it occur to anyone that the numbers could be totally false? Made up? Or maybe it's one of their lower cards and not the best one. She mentioned, "this is the Radeon 6000 series, which we now affectionately call Big Navi thanks to the many of you who nicknamed it for us," but is it really Big Navi? Maybe. We call it Big Navi based on all the specs, but what if it's not? Anyway, just some thoughts on something we will find out more about in 18 days.
 
Bang for buck is only an excuse for lower-performing products. Zen CPUs have gone from best bang for buck to being premium CPUs now. If AMD had the best GPU, they would be pricing them accordingly. At the end of the day, nobody wants to get paid minimum wage :laugh: .

I reckon RDNA2 had better have 30% more perf/dollar than Ampere in rasterization; only then does it make sense to overlook its lack of finesse in other areas (RT/DLSS/HW encoding/drivers/Reflex).

Not so sure about that. Bang/buck, or just price/perf, is always a factor. Otherwise every 3080 buyer would have ordered a 3090 instead, right? The fact that they do not, but still spend $700-800 on a GPU, is directly related to bang/buck. Or are you now gonna say the 3080 is midrange? :roll:
 

Oh, don't you worry, the e-peen crowd already gobbles up any 3090 stock they can find :D. Well, I too would like a piece, but all the vendors in my country are price gouging the hell out of the 3080/3090, going as high as $400-500 over MSRP for both; somehow that is legal in third-world countries.
Not that there were few 2080 Ti buyers either, with the 2080 Ti registering almost 1% in the Steam hardware survey.
There are many more rich kids out there than you might think :laugh:
 

Could we lay off the "bad driver" and "power hungry" nonsense? Last I checked, Nvidia cards have been ever so power hungry, and the 30-series launch and the current state of those cards is something to think about before pointing out the very same flaws in the competitor.
 
If the 5700 wasn't a decent release for you, neither will this one be.
Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's launch.
 
You seem to be one of the main ones who can't comprehend. Do you know memory cost as well? A GPU does not consist of just a die. According to MLID, who seems to know more than both of us, Nvidia is making maybe $50-60 on each 3080. Sounds like an expensive-ass GPU to me. It does not, and will not, cost AMD $50 below MSRP to make their cards. With winter approaching, I guess one of the bigger justifications for current Ampere is that no space heater is needed.

The Fury X did not flop... loudmouth AIBs leaked word to Nvidia, which allowed Nvidia to drop its pricing to rain on the Fury X's parade, so your argument there is a wash.

Funny how you forget the 290X vs. Titan fiasco... I won't mention who took that L.

Vega didn't flop, it was just late... and judging by AMD's predicament at the time, Zen was the more important product (now we're about to reap the reward for Vega's sacrifice). It's also funny that Vega is as good as its Nvidia counterpart now. But if you want to call Vega an L, I'll give you that.

The Radeon VII was never meant for the gaming segment; Navi had a bug and was going to be delayed, and they needed a "tide me over" card. And if the Radeon VII was trash, then so was the 2080... at least the Radeon VII could be used for compute.
RDNA1 (5700 XT) was a good bit faster, enough that Nvidia had to do a refresh. It did all this while being substantially smaller.

Now to the present. If RDNA2 is so bad, why did Nvidia rush a launch (no stock till 2021)? Why did they push out those space heaters? The xx80 chip is now back on the big die. Hmm, I wonder; it seems Nvidia knows more about AMD than you or I do, and they clearly jumped the gun. The 3090 is just two shaders shy of being the full die, and it is only a mere 10% faster than the 3080, all while being basically double the cost or more. That's doom and gloom in my book. If AMD wants the crown, it's right there for the taking. If I were them, I'd look to take both the CPU and GPU crowns in the same month. Why did Nvidia delay the 3070 and then give that BS excuse? They wanted to one-up AMD the day after. If you think AMD isn't ready this go-around, you're delusional. Too many irrational decisions by Nvidia this go-around, and it says one thing to me: AMD has arrived.
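For what it's worth, the 3090-vs-3080 value argument above can be sanity-checked with the round numbers quoted in this thread (3080 at $700 as the 100% baseline, 3090 roughly 10% faster at about double the price). These are forum ballpark figures, not measured benchmarks:

```python
# Rough perf-per-dollar comparison using the thread's round numbers
# (assumed figures, not benchmark data): 3080 = 100% perf at $700,
# 3090 = ~110% perf at ~$1500.
cards = {
    "RTX 3080": {"perf": 100, "price": 700},
    "RTX 3090": {"perf": 110, "price": 1500},
}

for name, c in cards.items():
    c["ppd"] = c["perf"] / c["price"]  # performance points per dollar
    print(f"{name}: {c['ppd']:.4f} perf points per dollar")

ratio = cards["RTX 3080"]["ppd"] / cards["RTX 3090"]["ppd"]
print(f"3080 delivers ~{(ratio - 1) * 100:.0f}% more performance per dollar")
```

With these assumed inputs the 3080 comes out at roughly double the performance per dollar, which is the whole "bang/buck is always a factor" point in a nutshell.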
 

I would not say worse than Turing, as I have never had an issue with my 5700. As a matter of fact, my 5700 turned out to be a pretty good video card. It is my 5700 XT that turned out to be utter crap as far as heat issues go. The customer is not supposed to be given a crap card, due to its build, and told to undervolt it. This applies to both Nvidia and AMD, and both have been caught with marketing lies.

Overall, IMHO, if you got last generation's video card, you do not need this generation's. I'll even go further and say that if you got any video card from 2016 onwards, you do not need a new one, because the tech advances you need to play your favorite games will not amount to much of an increase.

And contrary to what these tech talking heads want to say, the world does not revolve around 4K. It is 1080p AND LESS.

So enjoy what you have and buy what you absolutely need to buy and save some money in the end.
 
I think Nvidia estimated correctly what AMD would launch, based on what AMD did with the consoles, and they pushed the hell out of that silicon, reaching a new record in GPU power consumption.
If you now build a computer with a 10900K and an RTX 3080, you can skip buying a heater in the winter; these two can actually heat up the room :).

I know this is a semi-joke, but that is the first combination of components in a PC that could genuinely suffice as a heater during the winter while gaming.

500W is the same as my small office electric eco heater. The heat that a 500W PC produces can't vanish into thin air, of course; it just gets soaked up by your cooler fins until it's spat out into your room.
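As a back-of-the-envelope check on the heater comparison, here is a sketch of how fast 500W would warm the air in a small sealed office. The room size is an assumption, and the model deliberately ignores heat loss through walls and windows, so real rooms warm far more slowly:

```python
# Idealized estimate: how much would 500 W of PC heat warm the air in a
# small sealed room per hour? Ignores heat loss through walls/windows,
# so real-world warming is much slower than this upper bound.
power_w = 500.0                  # heat dissipated by the PC (assumed)
room_volume_m3 = 4 * 4 * 2.5     # small 4 m x 4 m x 2.5 m office (assumed)
air_density = 1.2                # kg/m^3 at room temperature
air_cp = 1005.0                  # specific heat of air, J/(kg*K)

air_mass_kg = room_volume_m3 * air_density
energy_per_hour_j = power_w * 3600
delta_t_per_hour = energy_per_hour_j / (air_mass_kg * air_cp)
print(f"~{delta_t_per_hour:.0f} °C of air warming per hour (no losses)")
```

The idealized answer is around 37 °C of air warming per hour, which is why even with realistic losses a 500W gaming PC noticeably heats a small room.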
 
2080 Ti +21% is not bad at all, IF the pricing is right. IF AMD makes 12GB and 16GB variants and prices them at $499 and $549, it would be a blast of a launch. On the other hand, IF AMD decides to be greedy, like with the Zen 3 announcement, and prices them at $649 and $699, then it's gonna be DOA. Remember the Radeon VII debacle, competing with the 1080 Ti two years later and charging $700? I'm afraid AMD will shoot itself in the foot once more and go with the suicidal greed option. I can still remember Lisa trying to sell us the Polaris replacement (Navi 1) for $499 first and then for $449. Greed is infectious as hell:

Are you talking about that time AMD jebaited Nvidia and then lowered their price? That's not greed; it's AMD playing Nvidia for once. Not to mention, it's a $50 price difference. You want to talk about greed? Let's talk about the $500 price increase from the 1080 Ti to the 2080 Ti, Nvidia's past staggered launches like the "flagship" 980 only to later release the 980 Ti, or the GeForce Partner Program.

I'm not for any price increases, but the double standard being applied is hilarious: $50 makes AMD greedy fucks, while $500 apparently doesn't even register for Nvidia.
 
I guess the dilemma will be like this:

3080: faster, ray tracing performance, DLSS, CUDA, (theory or fact, you decide) better drivers.

Big Navi: lower price, 16GB of VRAM, FreeSync (if I am not mistaken, NOT all FreeSync monitors are supported by Nvidia).

PS. Su called the card "Big Navi", so it's probably the top model. People gave that nickname to the big model, not some second mid-range model.

There are multiple SKUs on the Big Navi chip. It has long been rumored, with model numbers showing up in drivers, that there are a few SKUs for the Navi 21 chip, so this is Big Navi, just not necessarily the biggest Navi. It could be either way. I wouldn't be surprised, though, if this is the one right below the top SKU: 72 CUs.


AMD did not have a special event for the Radeon VII. It was at the tail end of another event, and no one expected it. You sound very sure, like you are Lisa Su, lol.

I think they get enough information and have enough insight to make a pretty educated guess.

Yeah, they must have had some guess; that's why they made a power-hungry card with no OC headroom. They probably had an idea of what AMD is targeting, but not exactly. Videocardz, RedGamingTech, Not An Apple Fan, and Moore's Law Is Dead have all basically confirmed there is a TOP card that absolutely no AIB has wind of. AMD is keeping the top chip in-house, reference only, until launch. So no one knows about the biggest chip that will be launched, because AMD has kept it so close to the chest.


Yield, my friend. Let's say your numbers are 100% accurate for how much it costs. That $9k wafer makes more usable chips. Seriously, your math ignores everything that's true. AMD will pump out more chips even if they paid more per wafer. Look at Nvidia struggling to make chips. The FE model will not even be sold at the Nvidia store, lol. The rumor was true that Nvidia won't be making too many of those and will let AIBs sell cards at a higher price, since the FE model costs too much to make. Now all of that is coming true.

GA102 is not dirt cheap to produce when the chips are so damn big. Come on now, lol. You are assuming there is 100% yield and that Nvidia is fitting more chips on the wafer than AMD. Come on now; no matter how you cook it, AMD will make more chips on a smaller node and be cheaper to make overall.

Oh, by the way, Nvidia is not paying $3k per wafer; it's more like close to $6k. So yeah, they are making a profit, sure, but not so much with the FE model, it seems, since they are now only selling it through Best Buy in the U.S., and the rumors are coming true that they don't want to make too many of those.
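The wafer-cost-vs-yield argument can be sketched as a back-of-the-envelope per-die calculation. Every input here is an assumption for illustration only: the wafer prices are the ones quoted in this thread, the die areas are rough public ballparks, and the defect densities are pure guesses (with Samsung 8N guessed worse, per the supply rumors above), not real foundry data:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross-die estimate: wafer area term minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_poisson(die_area_cm2, defects_per_cm2):
    """Poisson yield model: bigger dies are hit harder by the same defect rate."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

# Assumed inputs: thread-quoted wafer prices, ballpark die areas, and
# guessed defect densities (d0) -- none of these are confirmed figures.
chips = {
    "GA102 (Samsung 8N)": {"area_mm2": 628, "wafer_cost": 3000, "d0": 0.20},
    "Navi 21 (TSMC N7)":  {"area_mm2": 520, "wafer_cost": 9000, "d0": 0.05},
}

for name, c in chips.items():
    gross = dies_per_wafer(c["area_mm2"])
    good = gross * yield_poisson(c["area_mm2"] / 100, c["d0"])
    print(f"{name}: {gross} gross dies, ~{good:.0f} good, "
          f"~${c['wafer_cost'] / good:.0f} per good die")
```

With these guessed numbers the per-good-die costs land in the same ballpark despite the 3x wafer price gap, and the outcome swings heavily on the assumed defect densities, which is exactly the point: yield matters at least as much as the headline wafer price.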

 
The best thing AMD can do is release their video cards and take away 3080 sales while Nvidia has no available stock. Going off AMD's history, I shouldn't have high hopes for Big Navi, but I'm always hoping AMD, Nvidia, or Intel has something great coming out. That's how unbiased enthusiasts are, us guys who go way back, back when Al Gore invented the internet :D
 