Friday, October 9th 2020

AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

In its October 8 online launch event for the Ryzen 5000 "Zen 3" processors, AMD provided a teaser of the company's next flagship graphics card, slotted in the Radeon RX 6000 series. This particular SKU has been referred to by company CEO Lisa Su as "Big Navi," meaning it could be the top part of AMD's upcoming client GPU lineup. As part of the teaser, Su held up the reference-design card and provided three performance numbers for the card, as tested on a machine powered by a Ryzen 9 5900X "Zen 3" processor. We compared these performance numbers, obtained at 4K UHD, with our own testing data for the same games, to see how the card stacks up against other current-gen cards in its class. Our testing data for one of the games is from the latest RTX 30-series reviews; find details of our test bed here. We obviously test with a different CPU, since the 5900X is unreleased, but we use the highest settings presets in our testing.

In "Borderlands 3" at 4K, with the "Badass" performance preset and the DirectX 12 renderer, AMD claims a frame rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, at 70.3 FPS. It's important to note here that AMD may be using a different, lighter test scene than ours, since we don't use games' internal benchmark tools and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review; we use the DirectX 11 renderer in our regular VGA reviews.
In Gears 5 at 4K, with the DirectX 12 renderer and the "Ultra" preset, AMD claims 73 FPS. This number ends up 16.24 percent higher than that of the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080 is 15.61 percent faster than the AMD card, at 84.4 FPS.

Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test bed information here). In that testing, we found the RTX 2080 Ti scores 77.9 FPS at ultra settings with RTX off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.

We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are best optimized for its GPUs, but it lends plausibility to the theory that Big Navi may end up comparable to the RTX 2080 Ti, and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of 21% higher frame rates than the RTX 2080 Ti across these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs of the RX 5700 XT, so the claims somewhat line up.
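As a sanity check, the per-game speedups and an aggregate can be recomputed from the figures quoted above. Note that the headline number depends on how you aggregate: the geometric mean of the three per-game ratios lands a little higher than the ratio of the averaged frame rates. A quick sketch, using only the numbers quoted in this article (the choice of aggregation method is ours):

```python
import math

# Frame rates quoted above: AMD's claimed Big Navi numbers
# vs. TPU's own RTX 2080 Ti results, all at 4K.
games = {
    "Borderlands 3":       (61.0, 41.8),
    "Gears 5":             (73.0, 62.8),
    "CoD: Modern Warfare": (88.0, 77.9),
}

# Per-game speedup of Big Navi over the RTX 2080 Ti
ratios = {g: navi / ti for g, (navi, ti) in games.items()}
for g, r in ratios.items():
    print(f"{g}: +{(r - 1) * 100:.1f}%")

# Two ways to aggregate, giving slightly different answers:
# 1) geometric mean of the per-game ratios
geomean = math.prod(ratios.values()) ** (1 / len(ratios))
# 2) ratio of the averaged frame rates
mean_ratio = (sum(n for n, _ in games.values()) / len(games)) \
           / (sum(t for _, t in games.values()) / len(games))

print(f"geometric mean of ratios: +{(geomean - 1) * 100:.1f}%")    # +24.2%
print(f"ratio of mean frame rates: +{(mean_ratio - 1) * 100:.1f}%") # +21.6%
```

Three data points are far too few for either aggregate to mean much, but the gap between the two methods is a useful reminder of how sensitive such summaries are.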
Add your own comment

262 Comments on AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

#126
medi01
GoldenX
A decent Radeon release after so long?
If 5700 wasn't a decent release for you, neither will this one.
Posted on Reply
#127
springs113
So much ignorance in this thread; one thing I loved, though, is that all the discussion was as civil as possible. Now, if anyone here understands marketing then you know this: that was not AMD's best foot forward. I'd be worried if I were Nvidia (more so if that's a 64 CU card). People love to talk about track record (especially since AMD hasn't competed at the high end for years), but fail to realize AMD's recent track record: they've "jebaited" several times recently, with Zen 2, the 5700 XT, the 5600 XT, and technically Zen 3.

Whether we agree or disagree, Lisa Su and Co. have shown that they can deliver on their promises, and I for one clearly think AMD would be considered morons to hold a separate event to release a "flop" of a GPU. It's clear to me and a select few that they've got something special on their hands and are being real coy about it. Do you think the teaser at the end wasn't on purpose? They've been doing it all along. Look at the rumor mill when it came to Zen 3 vs RDNA2. It was clear to me what was prioritized and where. While Zen 3 is a big deal, RDNA2 is where the statement needed to be made.

IIRC no one here mentioned that these cards are cheaper to make on 7 nm overall vs Ampere. These cards will be very efficient and priced well. AMD is also doing what very few companies have done: being honest. These benchmarks clearly show AMD used games they struggle with (both CPU & GPU). Seems to me like they are showing worst-case scenarios to gauge the internet's reactions, and it's working. The GPU event was obviously last for a reason: they have the fastest CPUs in town and thus needed the GPUs to be tested with the best. This also shows off their ecosystem, and lastly surprises all the naysayers who can't seem to understand that they can compete. RDNA1 matched Turing's IPC, and Nvidia regressed in that department with Ampere.

A "halo" product ain't a damn 3070 competitor.
Why is that so hard for fanboys to understand?
Posted on Reply
#128
nguyen
springs113
Whether we agree or disagree, Lisa Su and Co. have shown that they can deliver on their promises, and I for one clearly think AMD would be considered morons to hold a separate event to release a "flop" of a GPU. It's clear to me and a select few that they've got something special on their hands and are being real coy about it. Do you think the teaser at the end wasn't on purpose? They've been doing it all along. Look at the rumor mill when it came to Zen 3 vs RDNA2. It was clear to me what was prioritized and where. While Zen 3 is a big deal, RDNA2 is where the statement needed to be made.

IIRC no one here mentioned that these cards are cheaper to make on 7 nm overall vs Ampere. These cards will be very efficient and priced well. AMD is also doing what very few companies have done: being honest. These benchmarks clearly show AMD used games they struggle with (both CPU & GPU). Seems to me like they are showing worst-case scenarios to gauge the internet's reactions, and it's working. The GPU event was obviously last for a reason: they have the fastest CPUs in town and thus needed the GPUs to be tested with the best. This also shows off their ecosystem, and lastly surprises all the naysayers who can't seem to understand that they can compete. RDNA1 matched Turing's IPC, and Nvidia regressed in that department with Ampere.

A "halo" product ain't a damn 3070 competitor.
Why is that so hard for fanboys to understand?
Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi 10 is slightly faster than TU106 (2070), it is already a remarkable improvement if Navi 21 is faster than GA104, I will give you that. GA102, however, is a different beast.

And Navi 21 is cheaper to produce than GA102? Do you know that AMD pays $9k per 7 nm wafer at TSMC while Nvidia pays only $3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter was 66%).

Navi 21's best silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi 21 competing with and beating GA104, which is still good.
Posted on Reply
#129
Dyatlov A
Will it have a good Vulkan driver for Red Dead Redemption 2, or will it freeze like the previous generation?
Posted on Reply
#130
Luminescent
Vayra86
No. Just:.. no, sorry. You can leave this drivel elsewhere.
I understand, some people can't comprehend that they can do that, handicap cards through drivers. Nah, impossible, corporations operating as a monopoly when they don't have any competition, this is nonsense.
While some people here can't get around this, the majority don't see a reason to ever upgrade beyond a GTX 1060, or something equivalent from AMD, until it's dead. The lack of good games makes PC gaming unattractive, and the rare good ones are so well made they don't need an RTX 3080 or whatever AMD makes. Just look at the recent Star Wars: Squadrons, or earlier ones like Jedi: Fallen Order; they run fine on a GTX 1060 and equivalent AMD cards. Big publishers like EA or Blizzard don't care about GPU sales, and it really hurts Nvidia when they put something like that on the market. Imagine gaming on a 4-year-old card with all settings turned up and the graphics top notch :laugh:
Posted on Reply
#131
Mussels
Moderprator
If they get stock before my 3080 eventually arrives (late NOVEMBER! FOR A PREORDER!) I'll just damn well switch to team red again.
Posted on Reply
#132
okbuddy
6900 XT about 2% faster than the 3080, and $50 cheaper.

A 3080 OC is exactly 200% of a 5700 XT.
The 6900 XT is double a 5700 XT and then some.
Posted on Reply
#133
Vayra86
Luminescent
I understand, some people can't comprehend that they can do that, handicap cards through drivers. Nah, impossible, corporations operating as a monopoly when they don't have any competition, this is nonsense.
While some people here can't get around this, the majority don't see a reason to ever upgrade beyond a GTX 1060, or something equivalent from AMD, until it's dead. The lack of good games makes PC gaming unattractive, and the rare good ones are so well made they don't need an RTX 3080 or whatever AMD makes. Just look at the recent Star Wars: Squadrons, or earlier ones like Jedi: Fallen Order; they run fine on a GTX 1060 and equivalent AMD cards. Big publishers like EA or Blizzard don't care about GPU sales, and it really hurts Nvidia when they put something like that on the market. Imagine gaming on a 4-year-old card with all settings turned up and the graphics top notch :laugh:
Did it occur to you I never even asked you to provide a source for these old and tired claims? And that you never provided one either?

This is because I have already seen them all and none of them holds any water whatsoever. You can spare yourself the trouble. And again... you can either learn a thing or two here or you can live in fantasy land, but again I'd say, do it elsewhere. I'll torch any nonsense I see, and so far you've been full of it.

Consider carefully what you might post next. This subject has been debunked a hundred times already ;) I'm not going there again.
Posted on Reply
#134
dinmaster
Did it occur to anyone that the numbers could be totally false? Made up? Or maybe it's one of their lower cards and not the best one. She mentioned, "this is the Radeon 6000 series, which we now affectionately call Big Navi, thanks to many of you who nicknamed it for us." But is it really Big Navi? Maybe. We call it Big Navi and assume the full specs, but what if it's not? Anyway, just some thoughts on something we will find out more about in 18 days.
Posted on Reply
#135
Vayra86
nguyen
Bang for buck is only an excuse for lower-performing products. Zen CPUs have gone from best bang for buck to being premium CPUs now. If AMD had the best GPU they would price it accordingly. At the end of the day, nobody wants to get paid minimum wage :laugh:.

I reckon RDNA2 had better have 30% more perf per dollar than Ampere in rasterization; only then does it make sense to overlook its lack of finesse in other areas (RT/DLSS/HW encoding/drivers/Reflex).
Not so sure about that. Bang/buck, or just price/perf, is always a factor. Otherwise every 3080 buyer would have ordered a 3090 instead, right? The fact that they do not, but still spend $700-800 on a GPU, is directly related to bang/buck. Or are you now gonna say the 3080 is midrange :roll:
Posted on Reply
#136
nguyen
Vayra86
Not so sure about that. Bang/buck, or just price/perf, is always a factor. Otherwise every 3080 buyer would have ordered a 3090 instead, right? The fact that they do not, but still spend $700-800 on a GPU, is directly related to bang/buck. Or are you now gonna say the 3080 is midrange :roll:
Oh don't you worry, the e-peen crowd already gobbles up any 3090 stock they can find :D. Well, I too would like a piece, but all the vendors in my country are price gouging the hell out of the 3080/3090, going as high as $400-500 over MSRP for both; somehow that is legal in third-world countries.
Not that there were few 2080 Ti buyers either, with the 2080 Ti registering almost 1% in the Steam hardware survey.
There are many more rich kids out there than you might think :laugh:
Posted on Reply
#137
kingDR
jesdals
I could live with that if the price is below the 3080's.
Agree.
Posted on Reply
#138
DuxCro
lynx29
I just checked on AMD's bicycles. They are sold out as well, lmao. wow. just wow.
Probably because people think AMD bicycles have IPC (ImProved Cycling) over other bicycles.
Posted on Reply
#139
Totally
Luminescent
I agree, efikkan, AMD sucks; they will launch a mediocre GPU with bad drivers at first that will probably consume a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and they don't care much about this market. They care about the millions of orders from Microsoft and Sony much more than the very, very few "enthusiasts" who might buy these high-end cards; that's why the first drivers will be so bad you are lucky if you boot into Windows. Why bother for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not really compete.
Could we lay off the "bad driver" and "power hungry" nonsense? Last I checked, Nvidia cards have been ever so power hungry, and the 30-series launch and the current state of those cards are something to think about before pointing out the very same flaws in the competitor.
Posted on Reply
#140
GoldenX
medi01
If 5700 wasn't a decent release for you, neither will this one.
Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's.
Posted on Reply
#141
springs113
nguyen
Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events annoucing all of those GPU mind you. Radeon VII in particular flopped so hard it was dead on arrival.

Well considering Navi10 is slightly faster than TU106 (2070), it is a remarkable improvement already when Navi21 is faster than GA104, I will give you that. GA102 however is a different beast.

And Navi21 is cheaper to produce than GA102 ? do you know that AMD pay 9k usd per 7nm wafer at TSMC vs Nvidia pay only 3k usd per 8N wafer at Samsung ? GA102 is dirt cheap to produce, Nvidia has like 60%+ profit margin afterall (last quarter was 66%).

Navi21 best silicon will be reserved for prosumer cards where AMD make more profit. So expect cut down Navi21 competing and beating GA104, which is still good.
You seem to be one of the main ones who can't comprehend. Do you know memory costs as well? A GPU does not consist of just a "die". According to MLID, who seems to know more than the both of us, Nvidia is roughly making maybe $50-60 on each 3080. Sounds like an expensive-ass GPU to me. It does not, and will not, cost AMD $50 below MSRP to make their cards. With winter approaching, I guess one of the bigger justifications for current Ampere is that no space heater is needed.

Fury X did not flop... loud-mouth AIBs gave word to Nvidia, which allowed Nvidia to drop its pricing to rain on Fury X's parade, so your argument there is a wash.

Funny how you forget the 290X vs Titan fiasco... I won't mention who took that L.

Vega didn't flop, it was just late... and judging by AMD's predicament at the time, Zen was the more important product (now we're about to reap the reward of Vega's sacrifice). It's also funny that Vega is as good as its Nvidia counterpart now. But if you want to call Vega an L, I'll give you that.

Radeon VII was never meant for the gaming segment; Navi had a bug and was going to be delayed, and they needed a "tide me over" card. And if Radeon VII was trash, then so was the 2080... at least the Radeon VII could be used for compute.
RDNA1 (5700 XT) was fast enough that Nvidia had to do a refresh. It did all this while being substantially smaller.

Now to the present... If RDNA2 is so bad, why did Nvidia rush a launch (no stock till 2021)? Why did they push out those space heaters? The xx80 chip is now back on the big die. Hmm, I wonder; it seems Nvidia knows more about AMD than you or I, and they clearly jumped the gun. The 3090 is just two SMs shy of the full die and is only a mere 10% faster than the 3080, all while being basically double the cost or more. That's doom and gloom in my book. If AMD wants the crown, it's right there for the taking. If I were them, I'd look to take both the CPU and GPU crowns in the same month. Why did Nvidia delay the 3070 and then give that BS excuse? They wanted to one-up AMD the day after. If you think AMD wasn't ready this go-around, you're delusional. Too many irrational decisions by Nvidia this time around, and it says one thing to me: AMD has arrived.
Posted on Reply
#142
Icon Charlie
GoldenX
Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's.
I would not say worse than Turing, as I have never had an issue with my 5700. As a matter of fact, my 5700 turned out to be a pretty good video card. It is my 5700 XT that turned out to be utter crap as far as heat issues go. The customer is not supposed to be given a crap card that has to be undervolted because of its build. This applies to both Nvidia and AMD, and both have been caught in marketing lies.

Overall, IMHO, if you got last generation's video card, you do not need this generation's. I'll even go further and say that if you got any video card from 2016 onwards, you do not need a new one, because the tech advances you need to play your favorite games will not amount to much of an increase.

And contrary to what these tech talking heads want to say, the world does not revolve around 4K. It is 1080p AND LESS.

So enjoy what you have, buy only what you absolutely need, and save some money in the end.
Posted on Reply
#143
Shatun_Bear
Luminescent
I think Nvidia estimated correctly what AMD will launch based on what they did with the consoles and they pushed the hell out of that silicon reaching a new record of GPU power consumption.
If you now build a computer with a 10900k and RTX 3080 you can skip buying a heater in the winter, these two can actually heat up the room :).
I know this is a semi-joke but that is the first combination of components in a PC that could genuinely suffice as a heater during the winter when gaming.

500 W is about the same as my small office electric eco heater. The heat a 500 W PC produces can't vanish into thin air, of course; it just gets soaked up by your cooler fins until it's spat out into your room.
Posted on Reply
#144
evernessince
RedelZaVedno
2080 Ti +21% is not bad at all, IF the pricing is right. IF AMD makes 12 GB and 16 GB variants and prices them at $499 and $549, it would be a blast of a launch. On the other hand, IF AMD decides to be greedy like with the Zen 3 announcement and prices them at $649 and $699, then it's gonna be DOA. Remember the Radeon VII debacle, competing with the 1080 Ti two years later and charging $700? I'm afraid AMD will shoot itself in the foot once more and go with the suicidal greed option. I can still remember Lisa trying to sell us the Polaris replacement (Navi 1) for $499 first and then for $449. Greed is infectious as hell.

Are you talking about that time AMD jebaited Nvidia and then lowered their price? That's not greed, it's AMD playing Nvidia for once. Not to mention, it's a $50 price difference. You want to talk about greed? Let's talk about the $500 price increase from the 1080 Ti to the 2080 Ti, Nvidia's past staggered launches like the "flagship" 980 that was followed up by the 980 Ti, or the GeForce Partner Program.

I'm not for any price increases, but it's hilarious the double standard being applied: $50 makes AMD greedy fucks, yet $500 apparently doesn't even register for Nvidia.
Posted on Reply
#145
Blueberries
TIL AMD fanboys actually believe the "jebait" happened
Posted on Reply
#146
Nkd
john_
I guess the dilemma will be like this:

3080: faster, ray tracing performance, DLSS, CUDA, (theory or fact, you decide) better drivers

Big Navi: lower price, 16 GB of VRAM, FreeSync (if I am not mistaken, NOT all FreeSync monitors are supported by Nvidia).

PS. Su called the card "Big Navi", so it's probably the top model. People gave that nickname to the big model, not some second mid-range model.
There are multiple SKUs on the Big Navi chip. It's long been rumored, with model numbers showing up in drivers, that there are a few SKUs for the Navi 21 chip, so this is Big Navi, just not necessarily the biggest Navi. If they have a bigger one, they are probably holding it back; it could be either way. I wouldn't be surprised, though, if this is the one right below the top SKU. 72 CUs.
nguyen
Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi 10 is slightly faster than TU106 (2070), it is already a remarkable improvement if Navi 21 is faster than GA104, I will give you that. GA102, however, is a different beast.

And Navi 21 is cheaper to produce than GA102? Do you know that AMD pays $9k per 7 nm wafer at TSMC while Nvidia pays only $3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter was 66%).

Navi 21's best silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi 21 competing with and beating GA104, which is still good.
AMD did not have a special event for Radeon VII. It was at the tail end of another event and no one expected it. You sound very sure, like you are Lisa Su lol.
Vayra86
I think they get enough information and have enough insight to make a pretty educated guess.
Yeah, they must have some guess; that's why they made a power-hungry card with no OC headroom. They probably had an idea of what AMD is targeting, but nothing exact. VideoCardz, RedGamingTech, Not An Apple Fan, and Moore's Law Is Dead have basically all confirmed there is a TOP card that absolutely no AIB has wind of. AMD is keeping the top chip in-house, reference only, until launch. So no one knows about the biggest chip that will be launched, because AMD has kept it so close to the chest.
nguyen
Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi 10 is slightly faster than TU106 (2070), it is already a remarkable improvement if Navi 21 is faster than GA104, I will give you that. GA102, however, is a different beast.

And Navi 21 is cheaper to produce than GA102? Do you know that AMD pays $9k per 7 nm wafer at TSMC while Nvidia pays only $3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter was 66%).

Navi 21's best silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi 21 competing with and beating GA104, which is still good.
Yield, my friend. Let's say your numbers are 100% correct on cost: that $9k wafer still makes more chips. Seriously, your math ignores everything that's true. AMD will pump out more chips even if they paid more per wafer. Look at Nvidia struggling to make chips. The FE model will not be sold at the Nvidia store, lol. The rumor was true that Nvidia won't be making too many of those and will let AIBs sell cards at a higher price, since the FE model costs too much to make. Now all that is coming true.

GA102 is not dirt cheap to produce when the chips are so damn big. Come on now, lol. You are assuming 100% yield and that Nvidia fits more chips on a wafer than AMD. Come on now; no matter how you cook it, AMD will get more chips per wafer on a smaller node and be cheaper overall.

Oh, and by the way, Nvidia is not paying $3k per wafer; it's more like $6k. So yeah, they are making a profit, sure, but not so much on the FE model it seems, since they are only selling it through Best Buy in the U.S. now, and rumors are coming true that they don't want to make too many of those.

itigic.com/why-did-nvidia-rtx-3000-arrive-in-8nm-with-samsung/
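The yield argument can be made concrete with a back-of-the-envelope dies-per-wafer calculation. Everything here is illustrative, not a real cost model: the die areas are the approximate published/rumored sizes (~628 mm² GA102, ~520 mm² Navi 21), the wafer prices are the ones claimed in this thread, and the 0.1/cm² defect density is a pure assumption applied to both fabs:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross dies-per-wafer approximation (before yield losses)."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

def good_dies(die_area_mm2: float, defect_density_per_cm2: float) -> int:
    """Apply a simple Poisson yield model: Y = exp(-D0 * A)."""
    area_cm2 = die_area_mm2 / 100
    y = math.exp(-defect_density_per_cm2 * area_cm2)
    return math.floor(dies_per_wafer(die_area_mm2) * y)

# Illustrative inputs only: die sizes and the wafer prices claimed above,
# plus a made-up 0.1/cm^2 defect density for both processes.
for name, area, wafer_cost in [("GA102 (8N)", 628, 3000), ("Navi 21 (N7)", 520, 9000)]:
    good = good_dies(area, defect_density_per_cm2=0.1)
    print(f"{name}: {dies_per_wafer(area)} gross dies, {good} good dies, "
          f"~${wafer_cost / good:.0f} per good die")
```

Under these made-up assumptions the smaller Navi 21 die does yield more chips per wafer (106 gross vs. 85), but the claimed 3x wafer-price gap still dominates the per-die cost, which is why the argument hinges entirely on what each company actually pays per wafer.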
Posted on Reply
#147
Th3pwn3r
The best thing AMD can do is release their video cards and take away 3080 sales while Nvidia has no available stock. Going off of AMD's history I shouldn't have high hopes for Big Navi, but I'm always hoping AMD, Nvidia, or Intel has something great coming out. That's how unbiased enthusiasts are, us guys who go way back, back when Al Gore invented the internet :D
Posted on Reply
#148
evernessince
nguyen
Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi 10 is slightly faster than TU106 (2070), it is already a remarkable improvement if Navi 21 is faster than GA104, I will give you that. GA102, however, is a different beast.

And Navi 21 is cheaper to produce than GA102? Do you know that AMD pays $9k per 7 nm wafer at TSMC while Nvidia pays only $3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter was 66%).

Navi 21's best silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi 21 competing with and beating GA104, which is still good.
None of those GPUs were really designed for gaming, though, and the Fury X was made because it was the only thing the graphics department could put together under Rory Read, who wanted to move AMD away from high-end graphics altogether.

Vega and Radeon VII are based on the Vega architecture, which is clearly focused on professional and compute workloads. They have a large part of the die dedicated to the HBCC, which does nothing for gaming. The only upside to the VII cards was that they still sell for $700, because they do very well in professional workloads.

In essence, "Big Navi" will be AMD's return to high-end gaming graphics cards after a LONG time.
GoldenX
Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's.
I suggest you read up on Navi before saying it has no new features: www.techpowerup.com/256660/amd-e3-2019-tech-day-all-slide-decks-ryzen-3000-zen-2-radeon-rx-5000-navi-etc

Among the features is localized L1 cache sharing. Shaders in a group can share content among their L1 caches. This not only increases effective bandwidth and cache size (by removing the need for duplicated data) but also reduces latency and cache misses. I would not be surprised if RDNA2 enables sharing across shader groups; anything that spares the GPU from having to access the L2 cache, which is significantly slower to access than L1, helps.

Given the boost in performance Navi has over prior AMD GPUs, and that it brought AMD on par efficiency-wise with Nvidia, I would certainly not call it a Turing. Just because AMD only released the mid-range certainly doesn't make it bad.

Also, what is your source for marketed features that had to be disabled? That was only for Vega as far as I'm aware, as I've looked over all the marketing slides AMD has provided.
Posted on Reply
#149
GoldenX
I'm not totally sure if architecture changes can count as features. Sure, that's the secret behind the performance uplift, but AMD still lacks variable-rate shading, mesh shaders, and ray tracing.
Primitive shaders are also in hardware in RDNA1, same as Vega, and were never enabled. Imagine developing a whole new stage of the graphics pipeline and never using it.
Posted on Reply
#150
beedoo
Blueberries
TIL AMD fanboys actually believe the "jebait" happened
Perhaps you're young; after all, you did just use 'TIL', which, after Googling it, I've just learned means 'Today I learned'. Perhaps you're very insightful, or perhaps naive...

Either way, you'd have to be living under a rock to miss the mind games going on in the industry right now. As such, you can choose to believe that Lisa Su just chucked her hand in and said this is all we've got, at an event that was about CPU launches, or you can be open to the possibility that AMD is playing the game and leaving something in the tank for the main GPU launch event in a few weeks.

If they had nothing to challenge Nvidia, I believe they would have teased a 'wait and see' line. Of course, there is also the possibility that AMD are softening the blow.

In my job, which involves being very analytical, I would fail dismally every time if I made rash decisions without considering all possibilities. Then again, I trust my gut instinct.

I'm not saying you're wrong in terms of the eventual outcome, but I am saying you're wrong to mock and be definitive in your opinion. Time will tell.
Posted on Reply