Thursday, January 10th 2019

NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

PC World managed to get a hold of NVIDIA CEO Jensen Huang, picking his brain on AMD's recently announced Radeon VII. Skipping the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering against NVIDIA's RTX 2000 series. The verdict? Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."
Of course, the real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing technology. AMD's choice not to invest significant resources and die space in what is essentially a stop-gap high-performance card to go against NVIDIA's RTX 2080 means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.
The next remarks concerned AMD's FreeSync (essentially a brand name for VESA's Adaptive-Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-based, cost-increasing route of G-Sync. While most see this as a sign that NVIDIA has seen a market slowdown for its price-premium G-Sync monitors and is simply ceding to market demand, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from the users whose FreeSync monitors supposedly don't work.

Reportedly, NVIDIA found that only 12 out of 400 FreeSync-capable monitors supported its G-Sync technology automatically in the initial battery of tests, with most panels requiring a manual override to enable it. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline with "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times. Source: PC World

#26
tvamos
ShurikN said:
As far as I understood the whole FreeSync launch, most issues were with the very earliest panels (and probably drivers). After a couple of months, the tech worked as intended.
Maybe Jensen only tested those early ones. All 400 of them.
#27
ShurikN
tvamos said:
Maybe Jensen only tested those early ones. All 400 of them.
Made me laugh not gonna lie :laugh:
#28
Xaled
megamanxtreme said:
Are those responses typical from this guy?
He is the richest hater and troll on Earth.
The richer he gets, the meaner he becomes...

ShurikN said:
Made me laugh not gonna lie :laugh:
He hates on something that he bought and tested 400 of ..:laugh::D
#29
Sasqui
This speaks to a lack of integrity and self-serving interests, among other things. Just wait for the next NVDA earnings release in early Feb... though the share price already has most of the lowered estimates baked in.

NVDA makes great products, but the leadership and culture of the company has been, and still is, one big money grab by way of coercion. From SLI to G-Sync, can you say "proprietary"?
#30
Octopuss
This almost looks like a new low. Just shut up maybe? :-O
#31
kastriot
Inferiority complex can be devastating and more money doesn't help ;)
#32
XiGMAKiD
The hyped ray tracing capability is not that good, considering they need to cut back so many ray-traced elements to get acceptable framerates on all cards. DLSS is a better feature to brag about, as it can increase performance and make the game a bit prettier. And I love how he twists the wording about the conditions for Adaptive Sync support to make them look good. 11/10, a legit CEO :laugh:
#33
Vayra86
tvamos said:
Maybe Jensen only tested those early ones. All 400 of them.
Against every game, no less.

https://www.polygon.com/2014/4/20/5633602/list-of-every-video-game-all-time

That makes for a little over 160,000 FreeSync tests - and if you consider the rate at which Steam spits out indies, they'd better have a really efficient test sequence :)

Yeah, it's a good day :D We needed some spice to make us forget all our GPU woes.
#34
Xuper
Hmm, what happened? Why so angry?
#36
Vayra86
XiGMAKiD said:
The hyped ray tracing capability is not that good, considering they need to cut back so many ray-traced elements to get acceptable framerates on all cards. DLSS is a better feature to brag about, as it can increase performance and make the game a bit prettier. And I love how he twists the wording about the conditions for Adaptive Sync support to make them look good. 11/10, a legit CEO :laugh:
Not sure about DLSS - did you see it? With old AA everyone complained about every odd jaggy or blur that resulted from using it, and now we're fine with all sorts of visual artifacting? For 'free performance'? What you're seeing is a lower-resolution render upscaled by a neural network, which can be very inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks. What's even better is that for the ONE game in which it actually works, all development was discontinued recently :D
#37
XiGMAKiD
megamanxtreme said:
Reminded me how the Xbox 360 had this feature; not sure why AMD didn't invest in it, it would really help free up resources (or maybe that was just speculation about the console).
https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8
My guess is that, at least on consoles, it made room for an upscaling feature; as for the PC, maybe the available AA techniques are enough for them right now, although if implemented it could give them a healthy performance boost.
#38
27MaD
"Underwhelming, performance is lousy, FreeSync doesn't work"? What? Did you expect him to say the new Radeon VII kicks A$$? :wtf:
Someone is shaking. :cry::twitch:
#39
megamanxtreme
Vayra86 said:
What you're seeing is a lower-resolution render upscaled by a neural network, which can be very inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks.
Checkerboard rendering on consoles is similar: some games can make it look okay-ish, but other times it's completely noticeable - like the FF part with the fence flickering in the distance. If the smoothing makes textures look better, I'm all in. If not, I'll stick to 4K textures and native 1080p, 1440p, and 4K.
#40
ShurikN
XiGMAKiD said:
DLSS is a better feature to brag about...
I thought the same thing until I saw how much blur that tech actually creates.
#41
phanbuey
Yeah, the new NVIDIA 'features' are experimental at best. Great potential, sure, but I was never a fan of upsampling/downsampling. I do like TXAA, though. Ray tracing will be a thing in like 3 years.
#42
Vayra86
phanbuey said:
Yeah, the new NVIDIA 'features' are experimental at best. Great potential, sure, but I was never a fan of upsampling/downsampling. I do like TXAA, though. Ray tracing will be a thing in like 3 years.
That's just it - TSSAA is miles better, and it can scale along with an internal render resolution slider. It's a one-size-fits-all approach for every level of performance, and it's already implemented in almost every recent game.

As for RT... until I see AMD announce a next console APU with hardware RT capability, I don't see any movement in that department, and deep down we all know that's what the tech needs for mass adoption and, thus, support in content. NVIDIA can't even take that cake, because they don't make x86 CPUs and don't do custom silicon.
#43
theoneandonlymrk
ZoneDymo said:
Someone seems a little bitter... and afraid, or something.
Like seriously, I don't say this as some jab: if you were comfortable with your own products and choices, you would not try to put the competition down like this.
Seconded - bang on, nothing short of catty.
FreeSync has worked fine for me this past year, go figure; I'm that lucky, I guess.
The guy should get his focus back. His company has probably lost AMD's entire market worth in value, and he's being investigated for ineptitude or deception (TBD), so he should probably be working to increase RTX adoption by devs, IMHO - not this shit banter.
#44
XiGMAKiD
Vayra86 said:
Not sure about DLSS - did you see it? With old AA everyone complained about every odd jaggy or blur that resulted from using it, and now we're fine with all sorts of visual artifacting? For 'free performance'? What you're seeing is a lower-resolution render upscaled by a neural network, which can be very inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks. What's even better is that for the ONE game in which it actually works, all development was discontinued recently :D
Well, if NVIDIA is fine with ray-traced games that are ray traced at the bare minimum, then visual weirdness here and there should be fine for them too :laugh:
#45
Imsochobo
Hellfire said:
Wow, I'd expect the CEO of a multi-billion-dollar company not to whine like a little bitch.

I mean, I've seen emo kids who are less whiney than he is.
What I hate most about nvidia:
Jensen.
Linux driver support.

Now that FreeSync is supported by NVIDIA, one BIG contributing factor for buying a Vega 64 at GTX 1070 price is gone. Back then the choice felt like shit-tons of Linux issues and no adaptive sync, or buying a 1080 for $100 more - I closed my eyes and purchased the Vega.
Still not happy about the Vega hardware; that's the only thing I'm not happy about with the AMD GPU.

If only AMD knew how to make good gaming GPU chips again...
#46
Anymal
unikin said:
I'm staying in the $250 buying ballpark no matter what. If NVIDIA charges $250 for an RTX 2050 with GTX 1060 performance, I'll just skip it and stay with my OC'd RX 570 until something comparable to a GTX 1070/Vega 56 comes along with a $250 price tag or less. That's how you say no to the ridiculous pricing of low-to-mid-end products like the x060 or 590. I don't care about red or green; I'll buy the best price/performance GPU inside my budget. If no progress has been made on the price/performance front, I'll skip the generation.
If you don't buy a new card at all you get infinite price/performance: no money spent, and the RX 570 stays. Best deal ever.
#47
Dimi
Well, it is pretty lousy if you look at the dozens of different FreeSync ranges on all these monitors. Most of them only have a FreeSync range of 48-75 Hz. What is the point of that? There is no standard like G-Sync has: with G-Sync, adaptive sync works from 30 Hz up to the monitor's maximum refresh rate, and almost every G-Sync monitor has an adaptive sync range of 30-144 Hz.

Mind you, my 1440p 165 Hz G-Sync monitor only cost me $350. Granted, it's a TN panel, but I use a ColorMunki display calibrator for color accuracy.
#48
Anymal
A lot of people here do not understand RT and DLSS. Yeah, the RTX line is all fail, VII is the best ever, AMD power FTW.
#49
Vayra86
Anymal said:
A lot of people here do not understand RT and DLSS. Yeah, the RTX line is all fail, VII is the best ever, AMD power FTW.
Feel free to add some content and enlighten us: what do you think 'people' don't understand about it? The tech has been covered extensively, and I think the majority understands it very well - it's vaporware at this point and for the foreseeable future.
#50
ShurikN
Anymal said:
A lot of people here do not understand RT and DLSS.
What do you mean, "do not understand"? Did you see DLSS in FF15? It's a mess. What's there to not understand? The evidence is obvious. Not to mention the horrible shimmering.