
NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

This is a highly undignified response, and it clearly means he's worried.

No doubt, with worse performance per watt than Pascal despite being built on 7 nm tech, he is wondering when the competition will turn up.
 
Shame on you Leather :mad:

 
The most important thing is that your RTX series is lousy in terms of performance jump and especially price. After these comments, some further drop in the stock is inevitable. :D

No doubt, with worse performance per watt than Pascal despite being built on 7 nm tech, he is wondering when the competition will turn up.

I didn't know the most important thing was performance per watt. LOL. :D Shame on you.

A lot of people here do not understand RT and DLSS. Yeah, the RTX line is all fail, VII is the best ever, AMD power FTW.

HAHA. Whine, little one, whine. What you can do is use RT in ONE game, four months after launch. And the RT effects were officially confirmed to have been lowered for somewhat better performance, which meant ~70 fps in FHD and ~30 fps average in 4K (with one of the best optimized engines) on a $1,000 ($1,200) card. After the patch, the effects were lowered further to get better results. Still, it's 50-55 fps average in 4K on a $1,000 ($1,200) card. :D Tomb Raider will get it at 30-ish fps in FHD on a 2080 Ti. FFXV RT was cancelled. Now these are shameful.
 
I'd dredge up some of your previous hype posts, but I can't be bothered anymore.

Looks like you're about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy, I guess the quote of the day is: "shame on you".
 
It was obvious from the moment AMD made an open-standard version of G-SYNC that it would eventually beat G-SYNC into submission. AMD invited NVIDIA to support it from day one too, but NVIDIA arrogantly refused; now they're forced to. No wonder NVIDIA is pissed.

AFAIK, G-SYNC is the technically superior solution, but don't quote me.

He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, and that sounds pretty suspicious. If it were true, enthusiasts would have been shouting about it on forums long ago, and reviewers might have noticed a little bit, too. As long as he doesn't name names, though, they probably can't touch him.

It sounds like those non-functioning monitors simply don't work with NVIDIA cards, but he doesn't want to admit that because it would look really bad for NVIDIA, so he drags AMD in as well. So underhanded, but NVIDIA's got form for this, so I'm not surprised. And I'm saying this as a hardcore longtime NVIDIA user, so I'm not just being biased.
 
He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, and that sounds pretty suspicious. If it were true, enthusiasts would have been shouting about it on forums long ago, and reviewers might have noticed a little bit, too. As long as he doesn't name names, though, they probably can't touch him.

The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, reviewers only sporadically cover monitor releases, and they often only check performance at a fixed refresh rate.
 
A mindless comment by the NVIDIA boss. Do I hear arrogance in his voice, or is he threatened by AMD's upcoming arsenal?
 
I mean, to be fair, RTX 2080 performance with a 300 W TDP on 7 nm is indeed very underwhelming. The RTX 2080 is on 12 nm and has a 225 W TDP, so I actually agree with that part of his statement. The FreeSync statement, however, is just petty.
 
So does he think that the 2080 has underwhelming performance? How about the 2070 or 2060, which are even slower?

The Radeon VII cards aren't setting a new benchmark in performance, but if they perform at the same level as the 2080, I wouldn't say they are underwhelming at all.

Of course, what do people expect the CEO of AMD's competitor to say? "Oh yes, they're decent cards"?
The 2070 and 2060 ain't $799.
 
He's spat the dummy out, poor guy. AMD talks of love and oneness for the gamer, and NVIDIA just bitches. I'm glad to say I've never bought anything from NVIDIA, and that guy is why. charl.
 
No halving, thanks to DLSS.
Except for the resolution... ;)

Had to say it. Thanks for the entertainment, guys. It's just PR anyway; Jen-Hsun has always been hands-on, and good on him. No question AMD dropped the ball with RTG, and we all know the reasons (mostly). It just shows how forward-thinking a concurrent graphics/compute uarch was in 2012. I don't know what people expect from Navi, but it too is GCN based. Vega 20 at last gives us 128 ROPs, but it doesn't break the 4 tris/clock front end, given the primitive discard engine bypassing GS->HS->DS->VS is a no-show. Perhaps fixed in Navi? Nvidia has turned Maxwell->Pascal->Turing into a better balanced gaming uarch, especially given its substantial devrel programs.
 
The issue is Nvidia just lost their dominance on the high end, when they needed it the most to push their Turing affair, which is the worst deal in GPU history. They now only have one top-end card that is completely priced out of the market, out of pure necessity because it's too flippin' large, to eclipse AMD's ancient GCN with some lazy tweaks and a shrink. Add to that having to give up the G-Sync cash cow because it had nowhere else to go, RTX not really gaining traction in hearts & minds or dev products, mining being over, and their stock price stabilizing at half what it was a few months back.

Yeah, I'd be grumpy too :)
Actually, with the 10 series and Vega it was the same, as the Vega 64 brought 1080 performance. However, it was more expensive for a long time (except for the very first days and the period after the mining craze), and it was released 1 year and 3 months later than the 1080. Now, this difference will be cut to less than 5 months, and it's "only" a new node, not even a whole new series of cards like Navi will be.

I'd dredge up some of your previous hype posts, but I can't be bothered anymore.

Looks like you're about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy, I guess the quote of the day is: "shame on you".
Can you understand that this is the same Vega on a smaller node? I hope you hate the 2080 just as much, since it only brings 1080 Ti performance. Hopeless attempts from you.

I mean, to be fair, RTX 2080 performance with a 300 W TDP on 7 nm is indeed very underwhelming. The RTX 2080 is on 12 nm and has a 225 W TDP, so I actually agree with that part of his statement. The FreeSync statement, however, is just petty.
You can save 70-80 W by switching BIOSes on a Vega 64 while losing only 1-2% of performance. If we assume this will also be available on the Radeon VII, the performance-per-watt ratio so beloved by Nvidia owners (as they cannot say anything else) will be close to the 2080's. And remember, this is still the same Vega, not Navi, if you know what I mean. You can say anything, but bringing products that even just reach the second-fastest cards of the other company, with only a fraction of the cash available, is quite an achievement.
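For what it's worth, here is a minimal back-of-envelope sketch of that claim in Python. All the numbers are assumptions pulled from the posts above (Radeon VII at ~300 W stock, ~75 W saved from a power-save BIOS for ~1.5% less performance, RTX 2080 at ~225 W with roughly the same performance), not measurements:

```python
# Back-of-envelope perf/watt comparison using the numbers quoted in this thread.
# These are assumptions, not benchmark results.

def perf_per_watt(relative_perf: float, power_w: float) -> float:
    """Relative performance divided by board power (higher is better)."""
    return relative_perf / power_w

radeon_vii_stock = perf_per_watt(1.000, 300)  # ~RTX 2080 performance at ~300 W
radeon_vii_saver = perf_per_watt(0.985, 225)  # ~75 W saved, ~1.5% slower
rtx_2080         = perf_per_watt(1.000, 225)  # baseline: 225 W TDP

print(f"2080 advantage vs stock VII     : {rtx_2080 / radeon_vii_stock:.0%}")  # ~133%
print(f"Power-save VII relative to 2080 : {radeon_vii_saver / rtx_2080:.1%}")  # ~98.5%
```

On those assumptions, the stock card trails the 2080 by roughly a third in perf/watt, while the power-save scenario lands within a couple of percent of it, which is essentially the claim being made above.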

No halving, thanks to DLSS.
And losing detail. Thank you.

The 2070 and 2060 ain't $799.
He was speaking of performance, not prices. Anyway, all three had their prices increased by $100-130 while not even having more VRAM.
 
The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, reviewers only sporadically cover monitor releases, and they often only check performance at a fixed refresh rate.
What, so screen tearing, which FreeSync was made to eliminate, wouldn't give the game away if it's not working?
People would notice.
 
The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, reviewers only sporadically cover monitor releases, and they often only check performance at a fixed refresh rate.

Actually no, there are in-depth YouTube vids about just about everything, and when there are doubts, some Reddit or YT or Twitch channel will explode because there's a new daily shitstorm to click on.
 
This strikes me more as Huang being somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative Gsync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia is again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which means Nvidia technology will continue to be only bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD for optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with no x86 capability, and only so much of that can be countered by its investment in RISC.

Full disclosure - I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful
 
Meh... I was ready for another green card (was thinking of going GTX 1070), but I just can't support this type of marketing bull... I'll just buy another used card; they've already made their money on it, and none of mine goes towards this bickering back and forth. Anyone got a used upgrade for my 290X or my GF's GTX 770? Lol. Seriously, only buying used cards from here on out.
 
Can you understand that this is the same Vega on a smaller node? I hope you hate the 2080 just as much, since it only brings 1080 Ti performance. Hopeless attempts from you.
While I largely agree with what you're saying in this post, it's worth pointing out that Vega 20 has some architectural changes from Vega 10; they're just focused on compute and AI, not gaming. As such, it's not "the same Vega", but rather a tweaked and optimized part that they've now decided to launch for a different market - likely due to there being more stock than expected, production being cheaper than expected, wanting a higher-end card to compete until Navi arrives, or all of the above. If the performance holds up, I don't mind them selling repurposed compute-architecture cards for gaming, but I'm very much looking forward to gaming-optimized Navi. Die sizes would at least be smaller due to stripping out FP64 cores and similar non-gaming features, which should leave room for improvements and still make it relatively cheap to produce.
 
If I had money to burn, I would buy AMD stock right now.
I think it will more than triple in price at the rate they are going.
Just my opinion, but also based on the innovation of their recent CPU products.

Haven't bought an AMD CPU since 2009 (I'm all Intel Xeons for my 7 rigs right now)..
So not an AMD fanboy, but clearly they have something good going in their CPU department...

The evidence of this is Intel magically boosting their core count and frequency on their CPUs to the max, just to stay a touch ahead of AMD's CPUs.
To me that says it all..
 
The stock market is a volatile market; even the Dow Jones has had some bad days. So what, should all Americans commit sudoku when it falls 5%?

Sorry, but that one is going in my sig. That was no autocarrot! :roll:
 
juiseman
What is a fanboy? And will it cool my Threadripper? :)
 
What he said: "Freesync Doesn't Work"
What he meant: "FreeSync doesn't work on NVIDIA cards, because we don't want it to."
 
Someone obviously knows that AMD is about to kick them really hard with Navi.
 