
Editorial NVIDIA's Weakness is AMD's Strength: It's Time to Regain Mid-Range Users

"This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs. "

AFAIK Ryzen is built on GloFo 12 nm...
 
For real, you think AMD still has a chance in low and mid? I can tell you that train has left the station.

nVidia will just dump a 12nm shrink of GP104 on Polaris 30, call it GTX 2030 and be done with it.

Lisa Su has no clue how to run the Radeon Group: she hires Raja Koduri to head it up and then just runs him over by crippling Vega development, among other things.

I understand why he just gave Su the finger and moved on to Intel. Intel knows Raja's true worth and just gave him a blank check and said do your thing.
 
For real, you think AMD still has a chance in low and mid? I can tell you that train has left the station.

nVidia will just dump a 12nm shrink of GP104 on Polaris 30, call it GTX 2030 and be done with it.

Lisa Su has no clue how to run the Radeon Group: she hires Raja Koduri to head it up and then just runs him over by crippling Vega development, among other things.

I understand why he just gave Su the finger and moved on to Intel. Intel knows Raja's true worth and just gave him a blank check and said do your thing.

This. Although I'm not sure about the Raja part, and also not about GP104 performance landing in the x30 bracket :D more like an x60 (Ti).
 
"This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs. "

AFAIK Ryzen is built on GloFo 12 nm...

Not to mention which TSMC 12 nm process: I'm quite sure 12FFN is for NVIDIA only. Sure, there is 12 nm FFC, but that is for mobile applications.
 
If a "12nm shrink" of Polaris was financially viable they would have done it long ago...
 
The RX 580 8GB already beats the GTX 1060 6GB in the mid-range segment, at least on paper. It is a bit faster, has more VRAM, and is cheaper. Yes, it consumes more power, but most people play a maximum of ~3 hours a day, so it barely affects your electricity bill. Which of those four factors matters more? Yet you have brainwashed people saying "I buy Intel and NV", or sellers who only tell customers to buy Intel and NV. Those people have to be told the truth.
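To put a rough number on the electricity-bill point, here's a quick Python sketch; the ~60 W gap, 3 hours/day, and $0.13/kWh are my own illustrative assumptions, not figures from any review:

```python
# Back-of-the-envelope cost of a GPU's extra power draw.
# All inputs below are assumptions for illustration only.
extra_watts = 60        # assumed RX 580 vs GTX 1060 gap under gaming load
hours_per_day = 3       # "most people play a maximum of ~3 hours a day"
price_per_kwh = 0.13    # assumed electricity price in USD

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year -> ${extra_cost_per_year:.2f}/year")
# ~66 kWh/year -> ~$8.54/year, i.e. well under a dollar a month
```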

You just have to look at well-optimized games like Forza Horizon 4 and you will see that AMD is way better than it appears in (earlier) NV-sponsored titles, where AMD cards are sometimes merely on par with a lower-tier NV card. Newer titles like AC: Origins or Shadow of the Tomb Raider show that it may not be this way anymore, as a Vega 56 is on par with a 1070 even in those NV-sponsored games.

What a comment. :D Saying the 2080 is an excellent product just shows how green you are. People DID NOT expect a 70-80% jump again like Pascal, but when the performance increase is a little smaller than the 700-to-900 transition, where the 900 cards were $50 cheaper, while the new cards are $100-150 and $500-600 MORE EXPENSIVE, calling these excellent products shows how biased you are. You are also forgetting that maybe around 2% of gamers buy the current gen's high-end NV GPU, which is marginal. Most cards sold are at the 1050 Ti and 1060 performance level. And you even had to mention that Intel is the king in CPUs. You forget (again) that most gamers use mid-range GPUs, where you don't get a single extra fps with Intel compared to AMD, not even with a mid-high 1070... And I can assure you that most 1080 Ti owners do not play CS:GO at 720p minimum settings, but use it on ultrawide or UHD monitors, where that maximum 10% CPU difference totally disappears. I just laugh at you so loud.
 
Power is not about the electricity bill with GPUs. It is about heat and noise.
 
You do realise that these do not exist in a vacuum, right?
Given the same cooler, a GPU that consumes more power will emit more heat. This will translate into higher temperatures and/or a faster fan speed on the cooler. More emitted heat will also heat up the surrounding air.

And while most readers of TPU likely do have proper air movement in their chassis, many people do not. I have seen many, many people come to complain about performance issues. When you look at the computer, they have slapped a GPU into a random machine in an mATX tower that has one fan lazily blowing air in. After about 20 minutes of gaming, the temperature inside the case is 50+ °C. And that gives interesting results.

In general: less heat, fewer problems.
 
If a "12nm shrink" of Polaris was financially viable they would have done it long ago...

Are they still making them? They could already have moved Polaris to GloFo's 12 nm process; that would not even be that big a change. I seriously doubt that Zen+ takes up all the manufacturing capacity on that node.
 
You do realise that these do not exist in a vacuum, right?
Given the same cooler, a GPU that consumes more power will emit more heat. This will translate into higher temperatures and/or a faster fan speed on the cooler. More emitted heat will also heat up the surrounding air.

And while most readers of TPU likely do have proper air movement in their chassis, many people do not. I have seen many, many people come to complain about performance issues. When you look at the computer, they have slapped a GPU into a random machine in an mATX tower that has one fan lazily blowing air in. After about 20 minutes of gaming, the temperature inside the case is 50+ °C. And that gives interesting results.

In general: less heat, fewer problems.

I don't think I said less heat is not better; I just linked results showing the differences are marginal. We are not talking about 390X numbers, where an MSI Lightning hit 82 degrees at 40 dBA (also on TechPowerUp). That's what stubborn people should accept: times are changing, and you shouldn't judge a new product by the numbers of the previous generation.
So you say most people buy a 1060-1070 with a $10 chassis? :D
 
So you say most people buy a 1060-1070 with a $10 chassis? :D
Oh god no. But I have seen enough people who did, so a lot of people do, and I shudder at the thought. :twitch::D

Check the reviews: the AIB model heat difference was less than 10 degrees, and the noise wasn't much higher either.

TechPowerUp MSI Gaming X:
GTX 1060 6GB: 67 degrees at 28 dBA
RX 480 8GB: 73 degrees at 31 dBA

Guru3D MSI Gaming X:
GTX 1060 6GB: 65 degrees at 37 dBA
RX 480 8GB: 73 degrees at 38 dBA
These are the high end of the selection for these cards: good cards sporting very good coolers.
I do not know what was up with TPU's RX 480 review, but 196 W for the RX 480 sounds just wrong. Maybe a bad card.

Guru3D's figures:
Power consumption: 168 W vs 136 W (23% more).
Ambient is probably 22 °C; let's say 20 °C. Delta temps: 53 vs 45 (18% more).
1 dBA probably means a slightly slower fan speed, so that accounts for the rest of the difference.

Btw, noise is on a logarithmic scale. 3 dBA louder is roughly 23% more perceived loudness (and double the acoustic power).
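For anyone who wants to check the decibel math, a minimal Python sketch using the usual rules of thumb (+3 dB doubles acoustic power, +10 dB doubles perceived loudness); treat these as approximations, not exact psychoacoustics:

```python
# 28 dBA (GTX 1060) vs 31 dBA (RX 480) from the figures above.
delta_db = 31 - 28

power_ratio = 10 ** (delta_db / 10)    # acoustic power: +3 dB ~ 2x
loudness_ratio = 2 ** (delta_db / 10)  # perceived loudness: +10 dB ~ 2x
print(f"power: {power_ratio:.2f}x, perceived loudness: {loudness_ratio:.2f}x")
# power: 2.00x, perceived loudness: 1.23x (about 23% louder)
```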
 
at launch, how reviewers observed the power drawn via the PCIe slot running a little out of spec.
And such launch issues are always going to slip out, even for the biggest companies selling premium products. Does that really keep people on the fence? Not that many. With AMD's power issue, sure: if you had some OEM box with an underwhelming motherboard, like most, plus some overrated PSU, some of those saw it rear its ugly head.
https://www.techpowerup.com/247985/...consumption-tested-better-but-not-good-enough

personally, i'm convinced i will be holding on to my 980ti until it blows up.
People buy based on: A) price; B) monitor resolution, now and in the near future; C) whether they see or find games that tax the experience/immersion of play.
So, right: with your 1440p monitor and the games you play, you feel good holding on... and I can't expect you to ante up perhaps another $500 after 2-1/2 years. Still, there are gamers on a 280X who are leaning toward 1440p, and if a $230 card came up they might see the value in a purchase.

Steam survey shows the 1060 outpacing it around 5-1. Which is a shame; the 580 is a better card than the 1060 most of the time.
So you're saying AMD sold 580s en masse, but those who bought them (miners) weren't showing up on Steam playing games... you don't say? And I'm surprised that for every, say, 100 1060s (all versions) AMD had 20 folks showing up with a 580, even though all the 1060s together had been on the market for almost a year before a 580 appeared, and it still bests them. Or does that include the 480 and the 470/570s? Because if it's just 580s, wow, how unfair; it still shows AMD was actually kicking ass, even with the phenomenal prices mining drove things to... AMD still had 1 in 5 showing up with a 580, while being mindful that such prices drove folks elsewhere... How exactly are you extrapolating the data?
 
For real, you think AMD still has a chance in low and mid? I can tell you that train has left the station.

nVidia will just dump a 12nm shrink of GP104 on Polaris 30, call it GTX 2030 and be done with it.

Lisa Su has no clue how to run the Radeon Group: she hires Raja Koduri to head it up and then just runs him over by crippling Vega development, among other things.

I understand why he just gave Su the finger and moved on to Intel. Intel knows Raja's true worth and just gave him a blank check and said do your thing.
Another Nvidia shill without a clue. The sky is falling!
 
Check the reviews: the AIB model heat difference was less than 10 degrees, and the noise wasn't much higher either.

TechPowerUp MSI Gaming X:
GTX 1060 6GB: 67 degrees at 28 dBA
RX 480 8GB: 73 degrees at 31 dBA

Guru3D MSI Gaming X:
GTX 1060 6GB: 65 degrees at 37 dBA
RX 480 8GB: 73 degrees at 38 dBA
Comparing values like that is just totally pointless.

Sound level measured in dB is logarithmic and relative (dB values are used to compare, not to give absolute figures).
31 dBA vs 28 dBA means the RX 480 is roughly 23% louder as you perceive it, and wastes twice as much power on noise.

Temperature is even worse, because you should be thinking about the rise above ambient. And when you think about the actual impact on your comfort, you should be thinking about heat...
That's why review sites that do this properly report thermal rise, not absolute temperature (and often give very extensive information on what the test setup looks like).

The simplest explanation I can give is this: 73 °C vs 65 °C most likely means something like 50 °C vs 42 °C of temperature rise. And since the coolers are more or less the same (they move similar quantities of air), it means the RX 480 generates A LOT more heat.
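A tiny sketch of that delta-T arithmetic; the ~23 °C ambient is an assumption (the reviews don't state it), so the rises are estimates:

```python
# Convert reported GPU temperatures to rise over ambient.
ambient_c = 23            # assumed room temperature, not measured
rx480_c, gtx1060_c = 73, 65

rx480_rise = rx480_c - ambient_c        # 50 C over ambient
gtx1060_rise = gtx1060_c - ambient_c    # 42 C over ambient
print(f"RX 480: +{rx480_rise} C, GTX 1060: +{gtx1060_rise} C, "
      f"ratio {rx480_rise / gtx1060_rise:.2f}x")
# With near-identical coolers, a ~1.19x higher rise means
# proportionally more heat being dumped into the case.
```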
 
On the note of memory bandwidth: why would AMD opt for a quad-stack, 4096-bit interface on Vega 20 (this is confirmed, as we have seen the chip being displayed), with potentially over 1 TB/s of raw memory bandwidth, if it wasn't at least somewhat limited by memory bandwidth? Or is that purely for memory capacity reasons? Honestly, almost everyone I talk to about GCN says it is crying out for more bandwidth. It's also worth pointing out that NVIDIA's delta color compression is significantly better than AMD's in Vega: the 1080 Ti almost certainly has quite a bit more effective bandwidth than Vega when that is factored in.
Well, I was talking about gaming performance, where Vega 64 is outperformed by the GTX 1080, which has much less memory bandwidth and computational power. So to make it clear: Vega is not bottlenecked in gaming by memory bandwidth or computational performance.
There are certainly professional workloads where even more memory bandwidth could be useful, which is what Vega 20 is targeting.

So resource utilisation is a major issue for Vega, then. Do you think there is any chance they could have 'fixed' any of this for Vega 20? I won't lie, I've been kinda hoping for some Magical Secret Sauce for Vega 10, perhaps NGG Fast Path or the fabled Primitive Shaders. -shrug- Even if it doesn't happen, I am satisfied with my Vega 56 as it is; I am only playing at 1080p/60 Hz, so it is plenty fast enough.
I seriously doubt there will be any major changes in Vega 20; it's mostly a node shrink with full FP64 support and potentially some other hardware for professional use. AMD would need a new architecture to fix GCN, Vega 20 won't do that, and Navi will probably just be a number of tweaks.
 
So, right: with your 1440p monitor and the games you play, you feel good holding on... and I can't expect you to ante up perhaps another $500 after 2-1/2 years. Still, there are gamers on a 280X who are leaning toward 1440p, and if a $230 card came up they might see the value in a purchase.

please don't read too much into a few words i wrote. besides, i do not feel good hanging on: in the 2.5 years before i bought the 980ti i went from a gtx 570 to a 770, a 780ti, and then the 980ti. i don't base my purchases on other people but on what i want and what i can reasonably pay. and honestly, only one purchase was gaming related.
 
Well, I was talking about gaming performance, where Vega 64 is outperformed by the GTX 1080, which has much less memory bandwidth and computational power. So to make it clear: Vega is not bottlenecked in gaming by memory bandwidth or computational performance.
There are certainly professional workloads where even more memory bandwidth could be useful, which is what Vega 20 is targeting.


I seriously doubt there will be any major changes in Vega 20; it's mostly a node shrink with full FP64 support and potentially some other hardware for professional use. AMD would need a new architecture to fix GCN, Vega 20 won't do that, and Navi will probably just be a number of tweaks.

The professional market does require the quad-stacked HBM2 and as much memory as possible. There are (professional) cards that allow 1 TB of SSD storage to be accessed as if it were video memory. https://www.extremetech.com/extreme...ew-ssg-a-gpu-with-1tb-of-ssd-storage-attached

The point with AMD is, it creates cards primarily for the Pro market and downsizes them for the consumer market (gaming). Nvidia does the very same. You can't flash GeForces into Quadros these days anymore without performance being crippled compared to a Pro card. And because of this, Vega is behind. It took a few driver revisions just to get it on par with the 1080, not the 1080 Ti.

A Vega 56 would be better value for your money, especially with flashing the 64 BIOS, overclocking, and undervolting. These seem to give very good results, as AMD was pretty much rushing those GPUs out without any proper tuning of power consumption. The Vega architecture on this process is already maxed out: anything above 1650 MHz with a full load applied is running toward 350 to 400 W territory. Almost twice a 1080, and probably not even a third more performance.

The refresh on a smaller node is good: it allows AMD to lower power consumption, push for higher clocks, and hopefully produce cheaper chips. The smaller you make them, the more fit on a silicon wafer. RTX is so damn expensive because those are frankly big dies, and big dies take up a lot of space on a wafer (rough math below).
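The dies-per-wafer point can be made concrete with the standard approximation; a minimal sketch where the die areas are made-up round numbers, not real GPU figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-die estimate: usable wafer area divided by die area,
    minus a correction for partial dies lost at the wafer edge."""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Hypothetical mid-range (~150 mm2) vs big (~450 mm2) die on a 300 mm wafer:
print(dies_per_wafer(300, 150))  # ~416 candidates per wafer
print(dies_per_wafer(300, 450))  # ~125 candidates per wafer
```

And that's before defect yield, which punishes big dies disproportionately on top of the raw count.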

Polaris was a good mid-range card, and still is. PUBG does excellently with a 75 Hz/FPS lock at WQHD. In my opinion, people don't need 144 fps on a 60 Hz screen. Cap that and you can halve your power bill easily. :) Something you don't hear people saying either.
 
The professional market does require the quad stacked HBM2 and as much as possible memory. There are cards (professional ones) that allow for 1TB of SSD storage to be accessed as it was video memory. https://www.extremetech.com/extreme...ew-ssg-a-gpu-with-1tb-of-ssd-storage-attached
That SSG was certainly an interesting product, but was it useful?
I've yet to see an actual use of this card. AFAIK it's not even being offered by OEMs (but I'd love to be surprised).
There's also one other argument for it being pointless: Nvidia ignored the idea. It's not that hard to put an SSD into a Tesla or something.
The point with AMD is, it creates cards primarily for the Pro market and downsizes them for the consumer market (gaming).
Yeah... the Pro market disagrees.
AMD may think they're making GPUs for pros, but actually they're still just making powerful chips. A pro product has to offer way more than just performance.
To be honest, I don't understand why this is happening; AMD is just way too big to make such weird mistakes.
I doubt AMD's share of the enterprise GPU market is larger than its CPU one...

Radeon does make very good custom chips, though. So when someone orders a particular setup, the large potential of GCN can finally be explored. Consoles are great. The Radeon Pro GPUs inside MacBooks are excellent.
Also, the Radeon Pro made for Apple is beautifully efficient, unlike the desktop parts (or even other mobile ones).
It shows that the power consumption / heat issues of GCN are a result of either really bad tuning or really bad binning (i.e. Apple gets all the GCN chips that come near Nvidia's quality).

There's not a single AMD GPGPU-accelerated machine on the Top500 either. Nor is there anything based on EPYC.
 
Another product or piece of tech that will be worthless after some time. Especially mobiles: when they break, there is no money back.
 
As this has gone off the rails regarding AMD having any new mid-range offerings, I'll just ask.

Is there a new product coming... something? Or does AMD just go idle for 12-18 months, maintaining Polaris products as they are?
 
Polaris was a good mid-range card, and still is. PUBG does excellently with a 75 Hz/FPS lock at WQHD. In my opinion, people don't need 144 fps on a 60 Hz screen. Cap that and you can halve your power bill easily. :) Something you don't hear people saying either.

Just... no... you cannot halve your power bill by tweaking settings or buying more efficient hardware, even if you flushed your PC down the toilet (and hence it drew no power at all).
 
Welcome to the hornets' nest: you wrote about red/green... now you have to live with a plethora of fanboys, trolls, and a few shills.
(Good piece, but anything mentioning red/green in a positive fashion WILL get trolled into oblivion by those who subscribe to their version of the truth.)
That's pretty much exactly what I told him when he wanted to write the article :)
 
Just... no... you cannot halve your power bill by tweaking settings or buying more efficient hardware, even if you flushed your PC down the toilet (and hence it drew no power at all).

Yeah, it's called Frame Rate Target Control. It works really well, and it almost halves the power consumption if you don't need 144 fps on a 75 Hz screen. The heat output is considerably lower.
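A crude sketch of why a frame cap can land near "almost halves"; the linear power-vs-FPS scaling and the 180 W figure are assumptions, since real savings depend on how clocks and voltage step down:

```python
# Rough estimate of GPU power saved by a frame-rate cap (e.g. FRTC).
uncapped_fps = 144          # assumed uncapped frame rate in a given game
capped_fps = 75             # cap matching a 75 Hz screen
full_load_power_w = 180     # assumed board power when uncapped

# Crude model: GPU work (and thus power) scales with frames rendered.
est_capped_power_w = full_load_power_w * capped_fps / uncapped_fps
print(f"~{est_capped_power_w:.0f} W capped vs {full_load_power_w} W uncapped")
# ~94 W vs 180 W: close to 'almost halves' under these assumptions
```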
 