
NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

AFAIK, G-SYNC is the technically superior solution...

Well, for such a premium price difference it should be. But it ain't worth as much.
 
Someone's in denial: he knows that AMD is about to kick them really hard with Navi.
Mum's the word on Navi. I don't think it's imminent. PS5 releases anywhere between Christmas 2019 and 2021. I think Christmas 2019 is probably the soonest we'll see a consumer Navi graphics card. Makes one wonder what AMD is going to release between now and then. Polaris 40 on 7nm?

I don't think Radeon VII really concerns Huang so much as the road ahead for NVIDIA is rough. AI isn't taking off like they banked on it doing. RTX reception was poor. 7nm transition for NVIDIA is going to be very costly. Cryptocurrency mining on GPUs is mostly dead. FreeSync integration is rougher than he expected. The sky looks cloudy to Huang where the sky over Su looks sunny.
 
Well, I don't think Huang is scared. No way.

The truth is that AMD's brand-new flagship GPU can't challenge even NVIDIA's third-fastest GPU, not even close. AMD itself compares it to the 2080... hehe. So AMD is comparing its flagship not to NVIDIA's flagship but to the third-fastest GPU, the 2080, and hyping THAT! Come on, THINK!
I remember the same thing when Vega 64 released: AMD compared it to the GTX 1080, NOT the GTX 1080 Ti. Think about it; AMD hyped Vega 64 just as hard back then.

The truth is that without the 7nm update the Radeon VII would be melting, that's a fact. It needs three fans on a REFERENCE card, which has never happened before in GPU history, just like Vega 64 had to use water cooling!

And even though the Radeon VII is made on 7nm, it STILL eats more than 300 W, and I'm sure the real power draw is somewhere around 350 W with peaks near 400 W. That's a terrible result!

I don't think people understand how big a handicap the Radeon VII exposes, given that it's built on a 7nm process... THINK! That's almost half of the RTX 2000 series' 12nm, so it should be eating much less power than it does now. Something is very wrong, and it's that AMD can't build an efficient GPU.

For example, the RTX 2080 eats about 220 W in average gaming; the Radeon VII is over 300 W, EVEN though it's made on 7nm lines!

It's a lousy GPU like its little brother Vega 64, and I call it Vega 64 II.

The RTX 2060 is a ten times better GPU. If we're talking about the fastest GPUs, that means the RTX 2080, RTX 2080 Ti and of course the RTX Titan.

It looks like the Radeon VII only just avoids water cooling. Let's see how it overclocks; if it does, its power draw will go absolutely sky high, even though it's a 7nm GPU.

A lousy GPU again, and Lisa knows it and Huang knows it.

We'll see it soon, or should I say in 12 months, when NVIDIA releases its 7nm GPUs. I promise: under 250 W, and performance at least 40% above the 2080 Ti.

Don't lie to yourself; look the truth in the eye. AMD wants everyone to have a 1 kW PSU, or AMD doesn't care. AMD should thank, and bow deeply to, another company, TSMC, for bringing it the 7nm process; without that help I don't think AMD would even have released the Radeon VII.

A lousy GPU. Shame on AMD, and people, wake up: AMD doesn't deserve any sympathy. It's a power-hungry GPU, slow, and pricey for its performance.
 
gamerman
You can get tablets for Nvidia addiction :) Sounds like you have the same thing as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
 
gamerman
You can get tablets for Nvidia addiction :) Sounds like you have the same thing as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
I believe the tablet for Nvidia addiction is called Nintendo Switch.



... I'll see myself out.
 
Valantar
Lol, how did I miss that? Come back in, I like your humor.
 
AMD fanbois will be so angry when, in a couple of years, RTX becomes the standard and AMD is forced to adopt it, effectively increasing the prices of their hot plastic cards even more.
You know, that is a good point: they invested in RTX, and it may or may not play out well for them. "Standard" is also a good question; DX12 only just introduced the RTX API in Win 10, so it's still very young. It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful of cards (or maybe one right now... the 2080 Ti) that can really handle it.

It's not a very compelling story at the moment.

Guys, you realize that currently the only implementation of RTRT on desktop is Windows 10 with the 1809 update, using DXR as the library that exposes said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

I've just got a 55" Samsung TV with FreeSync built in, 120 fps with my R9 290X, only at 1080p, but it works for me. WITH MY LEATHER JACKET ON LOL.....

Loving my Q6 so far. Been a great TV

No halving, thanks to DLSS.

Man, you guys' reliance on DLSS being an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, and twice as much of it as last time as well.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
 
Guys, you realize that currently the only implementation of RTRT on desktop is Windows 10 with the 1809 update, using DXR as the library that exposes said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.



Loving my Q6 so far. Been a great TV



Man, you guys' reliance on DLSS being an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, and twice as much of it as last time as well.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree (edit: to clarify, not the post quoted above, but the one you've all noticed if you've read the last page of posts). That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily one gets accustomed to insanity.
 
I bought a Water block for my Sapphire Vega 64 card. I do seriously hope that the layout is the same as when I ran Fire Strike this morning
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree. That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily one gets accustomed to insanity.


Exactly, and people are talking about DLSS and ray tracing while forgetting PhysX, HairWorks and SLI, which Nvidia took over and hogged for itself only to see the technology go unused because of the way it does things, versus AMD with Mantle, which led to Vulkan and DX12, or FreeSync, which Nvidia is suddenly supporting. Of course, with a comment from Jensen making it seem like FreeSync is only good on Nvidia cards.
 
It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

Yes, yes and yes. It's a component of DX in the latest Windows 10 version that everyone has access to.

The catch is that game developers have to write code to implement it, and the hardware has to be able to process it as well (see the minimal support-check sketch after the list). Currently, these games are in the works to support it, at least to some degree or another:
  • Assetto Corsa Competizione from Kunos Simulazioni/505 Games.
  • Atomic Heart from Mundfish.
  • Battlefield V from EA/DICE.
  • Control from Remedy Entertainment/505 Games.
  • Enlisted from Gaijin Entertainment/Darkflow Software.
  • Justice from NetEase.
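For anyone curious what "writing code to implement it" even starts with on the developer side, here is a minimal sketch of my own (an illustrative example, not taken from any of the titles above): a plain Direct3D 12 feature query for the raytracing tier. The point is that it goes through Microsoft's D3D12/DXR interface, not a vendor-specific API. It assumes Windows 10 1809 or later with the Windows SDK headers and linking against d3d12.lib.

```cpp
// Minimal DXR capability check via standard D3D12 feature detection.
// Assumptions: Windows 10 1809+, Windows SDK headers, link with d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // Ask the driver which raytracing tier (if any) it exposes.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    const bool dxrSupported =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::printf("DXR %s on this adapter.\n",
                dxrSupported ? "is supported" : "is NOT supported / not exposed");
    return 0;
}
```

Whether the check passes depends on the GPU and the driver exposing DXR; the query itself is the same for every vendor, which is the point being made here.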
 
What, so screen tearing, which FreeSync was made to fix, wouldn't give the game away if it weren't working?
People would notice.

The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.

Actually no, there are in depth Youtube vids about just about everything and when there are doubts, some reddit or YT or Twitch channel will explode because there's a new daily shitstorm to click on.

There have been problems noted. This PC Perspective article notes excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
 
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PC Perspective article notes excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-

They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.

https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community.

So you have about 450 more examples (of different monitors) to find before Huang is right; crack on.
He did say they're ALL broken.

And you are here trying to back his words. Have you seen this issue in the flesh?

Anyone with some sense would expect the odd monitor, or even a line of monitors, to possibly have issues, but to say they're all broken... well.

My Samsung works without artefacts etc., but to be fair it is a 45/75 Hz one, so I can see it's not that model.
 
"Underwhelming. The performance is lousy."

Kinda funny since that's exactly what I thought when reading the Turing reviews.
 
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PC Perspective article notes excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
There is no need to rationalize all that much after the debate; here you go:
C24FG70-blur.png
 
Yup, like RTX and DLSS …


Smartcom

loool

You weren't there when the first T&L GPU came out, eh!

Maybe he wasn't, but it looks like you weren't there when the first shading GPU (R300) came out.

- and this was 3 years ago... they only got worse.

AMD doesn't lock features behind hardware, they make open-source platforms.

They don't do it for you, or should I say us; they do it because it helps them save money on software development expenses. It's also the reason they have shit GPU drivers.
 
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PC Perspective article notes excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.

Sorry, but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and the board behind it.

TFT Central has it all, in case you are curious. Not once has FreeSync been considered the cause of ghosting or artifacting. The only gripe is limited ranges.
 
While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements from doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.

Vega IS far better suited for (professional) compute applications.
This is not about what AMD feels.
Memory bandwidth gives nothing to gaming performance on Vega.
It is not double the ROPs; they are still 64.
But guess what? I also suggest we wait for reviews.

Ultimately this is good news: finally Vega can compete with the 1080 Ti. Let's wait for the reviews and see if TSMC's 7nm is the silver bullet many had hoped for.

It's not really 7 nm; actual feature sizes are closer to Intel's 10 nm.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he's referring to a monitor not running the same hertz range in FreeSync as it does without it.
Which makes him right about that, since it's a lie sold by the specifications that makes people believe FreeSync covers the full range. This says something to the many people here who say AMD is not as dirty as Nvidia...

Sasqui, Assetto Corsa Competizione has shit graphics; ray tracing is not going to do it any good.
 
Vega IS far better suited for (professional) compute applications.
Yes, that's what I said. Kind of weird to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact. It doesn't mean that it can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well for this also. Nothing weird about that.
This is not about what AMD feels.
Considering they're the only ones with any knowledge of how this performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again, this comes down to how AMD feels about it.
Memory bandwidth gives nothing to gaming performance on Vega.
Source?
It is not double the ROPs; they are still 64.
That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though.

It's not really 7 nm; actual feature sizes are closer to Intel's 10 nm.
Node names are largely unrelated to feature size in recent years - there's nothing 10nm in Intel's 10nm either - but this is still a full node shrink from 14/12nm. Intel being slightly more conservative with node naming doesn't change anything about this.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he's referring to a monitor not running the same hertz range in FreeSync as it does without it.
If that's the case, he should look up the definition of "work". At best, that's stretching the truth significantly, but I'd say it's an outright lie. What you're describing there is not "not working".
Which makes him right about that, since it's a lie sold by the specifications that makes people believe FreeSync covers the full range. This says something to the many people here who say AMD is not as dirty as Nvidia...
Why is that? AMD helped create an open standard. How this is implemented is not up to them, as long as the rules of the standard are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement Freesync - which is how it should be, and not Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking FS2, which is another thing entirely), which they've delivered. AMD even has a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.

Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce this as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?
 
You mean "feels" as in "it's good enough to go out to the market"?
Yeah, sure it can work in gaming; the first Vegas proved that :) I'm saying it like that because it's the wrong chip for the wrong purpose. Maybe it would have been better to augment Polaris?
You're right, AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they are not doing it for good reasons, for us; they're doing it for themselves...
Thanks, man, that's a very elaborate response, you cleared up a few things :)
I didn't know the FreeSync problems are because of the monitors, not AMD or the FreeSync system itself.
Everyone is a liar today about the nm, eh?
 
I have a Vega 64 and I assure you that memory bandwidth isn't helping gaming. 20% overclock on HBM yields no tangible performance benefit with mine. These cards are constrained by power consumption, not memory bandwidth.
 
This strikes me more that Huang is somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative Gsync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia is again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which will continue to see Nvidia technology only bolted on rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD when it comes to optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

Full disclosure: I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful.

Have not heard of this: source link, please?
 
Have not heard of this: source link, please?
The HDMI 2.1 spec includes VRR support as a requirement, which would prevent Nvidia from certifying its cards against that spec if they still limited VRR support to G-Sync. I guess a "middle of the road" option would be possible by supporting VRR on non-G-Sync displays only if they are HDMI 2.1 displays, but that would just lead to (yet another) consumer uproar against Nvidia. Let alone the fact that (at least according to all reports I've seen) the HDMI 2.1 implementation of VRR is technically very, very close to AMD's existing FreeSync-over-HDMI extension of VESA Adaptive Sync. Nvidia would have no technical excuse whatsoever not to support FreeSync.
 