Thursday, January 10th 2019

NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

PC World managed to get hold of NVIDIA CEO Jensen Huang to pick his brain on AMD's recently announced Radeon VII. Skipping past the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering to compete with NVIDIA's RTX 2000 series. The answer? Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."
Of course, the real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing technologies. That AMD chose not to invest significant resources and die space in what is essentially a stop-gap high-performance card to go against NVIDIA's RTX 2080 means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.
The next remarks concerned AMD's FreeSync (essentially a name for VESA's Adaptive Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-based, cost-increasing route of G-Sync. While most see this as a sign that NVIDIA has noticed a market slowdown for its price-premium G-Sync monitors and is simply ceding to market demands, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from users who supposedly have FreeSync monitors that don't work.

Reportedly, NVIDIA found that only 12 out of 400 FreeSync-supporting monitors supported its G-Sync technology automatically in the initial battery of tests, with most panels requiring a manual override to enable the technology. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline to the matter with a "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times. Source: PC World

270 Comments on NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

#202
B-Real
The most important thing is that your RTX series is lousy in terms of performance jump, and especially in price. After these comments, some further stock decrease is inevitable. :D

Fluffmeister, post: 3974221, member: 101373
No doubt, with worse performance per watt than Pascal despite being built on 7nm tech he is wondering when the competition will turn up.
I didn't know the most important thing was performance per watt. LOL. :D Shame on you.

Anymal, post: 3973811, member: 158578
A lot of people here who do not understand rt and dlss. Yeah, rtx line is all fail, VII is the best ever, AMD power FTW
HAHA. Whine, little one, whine. What you can do is use RT in ONE game, in 4 months' time. And the effect of RT was officially confirmed to have been dialed down for somewhat better performance, which meant ~70 fps in FHD and ~30 fps average in 4K (with one of the best-optimized engines) on a $1,000 ($1,200) card. After the patch, the effects were lowered further to get better results. Still, it's 50-55 average in 4K on a $1,000 ($1,200) card. :D Tomb Raider will get it with FHD 30-ish fps on a 2080 Ti. FFXV's RT was cancelled. Now those are a shame.
#203
Fluffmeister
I'd dredge up some of your previous hype posts, but I can't be bothered anymore.

Looks like you're about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy, I guess the quote of the day is: "shame on you".
#204
qubit
Overclocked quantum bit
It was obvious from the moment AMD made an open-standard version of G-SYNC that it would eventually beat G-SYNC into submission. AMD invited NVIDIA to support it from day one, too, but NVIDIA arrogantly didn't want to; now they're forced to. No wonder NVIDIA is pissed.

AFAIK, G-SYNC is the technically superior solution, but don't quote me.

He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, that sounds pretty suspicious. If it was true, enthusiasts would have been shouting about it on forums long ago and reviewers might have noticed a little bit, too. As long as he doesn't name names though they probably can't touch him.

It sounds like those non-functioning monitors simply don't work with NVIDIA cards, but he doesn't wanna admit that because it would look really bad for NVIDIA, so he includes AMD as well. So underhand, but NVIDIA's got form for this so I'm not surprised. And I'm saying this as a hardcore longtime NVIDIA user, so I'm not just being biased.
#205
mindbomb
qubit, post: 3974261, member: 46003
He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, that sounds pretty suspicious. If it was true, enthusiasts would have been shouting about it on forums long ago and reviewers might have noticed a little bit, too. As long as he doesn't name names though they probably can't touch him.
The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, and reviewers only sporadically cover monitor releases, often checking performance at a fixed refresh rate only.
#206
Super XP
A mindless comment by the Nvidia boss. Do I hear arrogance in his voice, or is he threatened by AMD's upcoming arsenal?
#207
PooPipeBoy
Oh great, now Jensen himself has jumped into the Nvidia Vs AMD poo-throwing arguments.
He could've made a constructive (or even neutral) comment, but no...
#208
sergionography
I mean, to be fair, RTX 2080 performance but with a 300 W TDP on 7 nm is indeed very underwhelming. The RTX 2080 is on 12 nm and has a 225 W TDP, so that part of his statement I actually agree with. The FreeSync statement, however, is just petty.
#209
Midland Dog
human_error, post: 3973743, member: 61722
So does he think that the 2080 has underwhelming performance? How about the 2070 or 2060 which are even slower?

The Vega VII cards aren't setting a new benchmark in performance but if they perform at the same level as the 2080 I wouldn't say that they are underwhelming at all.

Of course what do people expect the CEO of AMD's competitor to say "oh yes they're decent cards"?
The 2070 and 2060 ain't $799.
#210
xtreemchaos
He's spat the dummy out, poor guy. AMD talks of love and oneness for the gamer, and NVIDIA just bitches. I'm glad to say I've never bought anything from NVIDIA, and that guy is why. charl.
#211
steen
Anymal, post: 3974134, member: 158578
No halving, thanks to DLSS
Except for the resolution... ;)

Had to say it. Thanks for the entertainment, guys. It's just PR anyway; Jen-hsun has always been hands-on, and good on him. No question AMD dropped the ball with RTG, and we all know the reasons (mostly). It just shows how forward-thinking a concurrent graphics/compute uarch it was in 2012. I don't know what people expect from Navi, but it too is GCN-based. Vega 20 at last gives us 128 ROPs, but doesn't break the 4 tris/clock front end, given the primitive discard engine bypassing GS->HS->DS->VS is a no-show. Perhaps fixed in Navi? Nvidia has turned Maxwell->Pascal->Turing into a better-balanced gaming uarch, especially given substantial devrel programs.
#212
B-Real
Vayra86, post: 3973770, member: 152404
The issue is Nvidia just lost their domination on the high end, when they needed it the most to push their Turing affair, which is the worst deal in GPU history. They now only have one top-end card that is completely priced out of the market - out of pure necessity because its too flippin' large - to eclipse AMD's ancient GCN with some lazy tweaks and a shrink. So that, alongside them having to give up the Gsync cash cow because it had nowhere else to go, RTX not really gaining traction in hearts & minds or dev products, mining being over, and their stock price stabilizing at half what it was a few months back.

Yeah, I'd be grumpy too :)
Actually with the 10 series and Vega, it was the same, as the Vega 64 brought 1080 performance. However, it was more expensive for a long time (except for the very first days and the period after the mining craze) and it was released 1 year and 3 months later than the 1080. Now, this difference will be cut to less than 5 months, and it's "only" a new fab, not even a whole new series of cards like Navi will be.

Fluffmeister, post: 3974252, member: 101373
I'd dredge up some of your previous hype posts, but I can't be bothered anymore.

Looks like you're about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy, I guess the quote of the day is: "shame on you".
Can you understand that this is the same Vega on a smaller process? I hope you hate the 2080 just as much, as it only brings 1080 Ti performance. Hopeless attempts from you.

sergionography, post: 3974364, member: 102909
I mean, to be fair, RTX 2080 performance but with a 300 W TDP on 7 nm is indeed very underwhelming. The RTX 2080 is on 12 nm and has a 225 W TDP, so that part of his statement I actually agree with. The FreeSync statement, however, is just petty.
You can gain 70-80 W by switching BIOSes on a Vega 64 while losing only 1-2% of performance. If we assume the same will be available on the Radeon VII, the performance/watt ratio so prized by Nvidia owners (as they cannot point to anything else) will be close to the 2080's. And remember, this is still Vega, not Navi. If you know what I mean. You can say anything, but bringing products that reach even the second-fastest cards of the other company with only a fraction of the cash available is quite an achievement.

Anymal, post: 3974134, member: 158578
No halving, thanks to DLSS
And losing detail. Thank you.

Midland Dog, post: 3974477, member: 168254
The 2070 and 2060 ain't $799.
He was speaking of performance, not prices. Anyway, all three increased in price by $100-130 while not even getting more VRAM.
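The perf/watt claim a few lines up is simple arithmetic, and a quick sketch makes it concrete. This uses only the round numbers quoted in this thread, and the eco-BIOS figures (~75 W saved for ~1.5% performance) are assumptions carried over from the Vega 64 claim, not measurements:

```python
# Back-of-envelope perf/watt comparison using the numbers quoted in this
# thread. "perf" is relative gaming performance (2080 = 100); "watts" is
# the rated TDP, not measured draw - a rough proxy at best.
rtx_2080 = {"perf": 100.0, "watts": 225.0}
radeon_vii_stock = {"perf": 100.0, "watts": 300.0}
# Assumed: a power-saving BIOS/undervolt drops ~75 W for ~1.5% performance,
# mirroring the Vega 64 behaviour described above.
radeon_vii_eco = {"perf": 98.5, "watts": 225.0}

def perf_per_watt(card):
    # Higher is better: relative performance per watt of TDP.
    return card["perf"] / card["watts"]

for name, card in [("RTX 2080", rtx_2080),
                   ("Radeon VII (stock)", radeon_vii_stock),
                   ("Radeon VII (eco)", radeon_vii_eco)]:
    print(f"{name}: {perf_per_watt(card):.3f} perf/W")
```

Under these assumed numbers, the eco-tuned card lands within a couple of percent of the 2080's perf/watt, while the stock card trails it by roughly a quarter - which is the whole argument in one division.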
#213
theoneandonlymrk
mindbomb, post: 3974272, member: 148781
The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, and reviewers only sporadically cover monitor releases, often checking performance at a fixed refresh rate only.
What, so screen tearing, which FreeSync was made to eliminate, wouldn't give the game away if it's not working?
People would notice.
#214
Vayra86
mindbomb, post: 3974272, member: 148781
The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, and reviewers only sporadically cover monitor releases, often checking performance at a fixed refresh rate only.
Actually no, there are in-depth YouTube vids about just about everything, and when there are doubts, some Reddit, YT, or Twitch channel will explode because there's a new daily shitstorm to click on.
#215
Camm
This strikes me more that Huang is somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative Gsync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia is again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which means Nvidia technology will continue to be merely bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD for optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

Full disclosure - I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful.
#216
Aaron_Henderson
Meh... I was ready for another green card (was thinking of going GTX 1070), but I just can't support this type of marketing bull... I'll just buy another used card... they've already made their money on it, and none of mine goes towards the bickering back and forth. Anyone got a used upgrade for my 290X or my gf's GTX 770? Lol. Seriously, only buying used cards from here on out.
#217
Valantar
B-Real, post: 3974504, member: 170068
Can you understand that this is the same Vega on a smaller process? I hope you hate the 2080 just as much, as it only brings 1080 Ti performance. Hopeless attempts from you.
While I largely agree with what you're saying in this post, it's worth pointing out that Vega 20 has some architectural changes from Vega 10 - they're just focused on compute and AI, not gaming. As such, it's not "the same Vega", but rather a tweaked and optimized part that they've now decided to launch for a different market - likely due to there being more stock than expected, production being cheaper than expected, wanting a higher-end card to compete until Navi arrives, or all of the above. If the performance holds up, I don't mind them selling repurposed compute architecture based cards for gaming, but I'm very much looking forward to the gaming optimized Navi. Die sizes would at least be smaller due to stripping out FP64 cores and similar non-gaming features. Should leave room for improvements and still make it relatively cheap to produce.
#218
juiseman
If I had money to burn, I would buy AMD stock right now.
I think it will more than triple in price at the rate they are going.
Just my opinion, but it's also based on the innovation of their recent CPU products.

Haven't bought an AMD CPU since 2009 (I'm all Intel Xeons for my 7 rigs right now)...
So not an AMD fanboy, but clearly they have something good going in their CPU department...

The evidence of this is Intel magically boosting their core count and
frequency on their CPUs to the max, just to stay a touch ahead of AMD's CPUs.
To me that says it all...
#219
Vayra86
Nxodus, post: 3973989, member: 183665
Stock market is a volatile market, even the Dow Jones has had some bad days, so what, should all americans commit sudoku when it falls 5%?
Sorry, but that one is going in my sig. That was no autocarrot! :roll:
#220
xtreemchaos
@juiseman
what is a fanboy ? and will it cool my threadripper ? :)
#221
Valantar
xtreemchaos, post: 3974567, member: 183324
@juiseman
what is a fanboy ? and will it cool my threadripper ? :)
I believe you'll need a waterboy for that.
#222
FordGT90Concept
"I go fast!1!11!1!"
What he said: "Freesync Doesn't Work"
What he meant: "FreeSync doesn't work on NVIDIA cards, because we don't want it to."
#223
juiseman
My spelling is atrocious... my thoughts are faster than my typing.
I blame Google for making me extra lazy.
Ha...
#224
Durvelle27
Someone obviously knows that AMD is about to kick them really hard with Navi.
#225
tvamos
qubit, post: 3974261, member: 46003
AFAIK, G-SYNC is the technically superior solution...

Well, for such a premium price difference, it should be. But it ain't worth that much.