
AMD Vega Microarchitecture Technical Overview

At nominal clock speeds, it should perform at about GTX 1080 level. In some games it will get closer to the 1070; in others (namely those with Vega optimizations) it will toy with the 1080 Ti. The reason for this isn't necessarily architectural; it's that Pascal is a better fit for TSMC 16nm than Vega is for GloFo 14nm (Pascal can hit crazy high clock speeds and do more with less). For AMD this is an okay tradeoff, because Vega will be easy to turn into Ryzen APUs, where the big bucks are for them (a huge reason to buy AMD over Intel). Pushing Vega into consoles and APUs is the only way to get developers to implement the Vega optimizations. Vega is as much about software as it is about hardware (it introduces 40 new instructions).
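
To put rough numbers on the "do more with less" point: theoretical FP32 throughput is shader count x 2 FLOPs (one FMA per clock) x clock speed. A quick sketch using the published reference boost clocks:

    # Theoretical FP32 throughput = shaders * 2 (FMA) * boost clock.
    # Clocks are the published reference boost values for each card.

    def tflops(shaders: int, boost_ghz: float) -> float:
        return shaders * 2 * boost_ghz / 1000

    print(f"Vega 64:  {tflops(4096, 1.546):.1f} TFLOPS")  # ~12.7
    print(f"GTX 1080: {tflops(2560, 1.733):.1f} TFLOPS")  # ~8.9
    # Pascal lands near Vega in most games with ~30% fewer theoretical
    # FLOPS -- the "do more with less" point: higher clocks, better utilization.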
 
I don't know about that. Generally the problem is saturating ALL of the CUs to get the performance out of it, something async compute/Vulkan partially addresses, but developers aren't using those.
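
A toy model of that saturation point (the timings and idle fraction below are made-up assumptions, not measured numbers): async queues let independent compute work hide in the CU time that graphics alone leaves idle.

    # Toy illustration of why async compute raises CU utilization.
    # All timings and the idle fraction are invented for illustration.

    GRAPHICS_MS   = 10.0   # graphics work per frame, leaves CUs partly idle
    COMPUTE_MS    = 4.0    # independent compute work (e.g. post-processing)
    IDLE_FRACTION = 0.35   # share of CU time graphics alone leaves unused

    serial_frame = GRAPHICS_MS + COMPUTE_MS                      # one queue
    overlap      = min(COMPUTE_MS, GRAPHICS_MS * IDLE_FRACTION)  # hidden in idle gaps
    async_frame  = GRAPHICS_MS + COMPUTE_MS - overlap            # async queues

    print(f"serial: {serial_frame:.1f} ms, async: {async_frame:.1f} ms")
    # -> serial: 14.0 ms, async: 10.5 ms -- same work, better saturation.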

And if you saturate the CUs, there goes your power consumption.
 
At the end of the day, a chip uses X watts to do Y operations. You can change and finesse the Y, but you can't change X at the architectural level; you still need X watts to power Z amount of CUs to render that dickbutt.gif at 30 FPS.
You can optimise that operation to the moon and back, it won't change X.
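
As a first-order sketch of that X/Y/Z argument (the joules-per-op, idle power, and workload sizes are invented for illustration, not measured Vega figures):

    # First-order power model for the "X watts to do Y operations" argument.
    # Energy-per-op, idle power, and workload sizes are illustrative only.

    ENERGY_PER_OP_J = 30e-12   # X: joules per shader op, fixed by process/arch
    IDLE_POWER_W    = 25.0     # board power with the CUs sitting idle

    def board_power(ops_per_frame: float, fps: float) -> float:
        """Watts needed to sustain `fps` with `ops_per_frame` of work (the Y)."""
        return IDLE_POWER_W + ops_per_frame * fps * ENERGY_PER_OP_J

    print(f"naive:     {board_power(250e9, 30):.1f} W")  # 250.0 W
    print(f"optimized: {board_power(125e9, 30):.1f} W")  # 137.5 W -- halving Y helps,
    # but X (joules per op) is baked into the silicon: a fixed workload at a
    # fixed frame rate can't get cheaper without a process or architecture change.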
 
Not for nothing, but Gamers Nexus showed that the Vega FE doesn't necessarily need to run at the power targets AMD set to do what it was designed to do. In fact this has been the case with AMD GPUs for quite some time: they can all be undervolted to perform way better than where AMD has them set. It's part of AMD's history. While I do agree with some points of your argument, I think AMD just set a maximum safe operating voltage just to get by.
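
The physics behind that undervolting headroom, as a rough sketch: dynamic CMOS power scales with roughly frequency x voltage squared. The stock wattage and voltages below are illustrative, not AMD's actual Vega FE operating points:

    # Why undervolting works: dynamic power scales roughly as P ~ f * V^2.
    # Stock power and voltages are illustrative, not AMD's Vega FE table.

    def scaled_power(stock_w: float, v_old: float, v_new: float,
                     f_old: float, f_new: float) -> float:
        """First-order CMOS dynamic power scaling."""
        return stock_w * (f_new / f_old) * (v_new / v_old) ** 2

    # Same 1600 MHz clock, core voltage dropped from 1.20 V to 1.05 V:
    print(f"{scaled_power(300.0, 1.20, 1.05, 1600, 1600):.0f} W")
    # -> ~230 W, roughly 23% less power at identical performance.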
 
More power-saving issues?

Most people react as if they have no idea how much power a refrigerator uses. Or a hair dryer, for that matter.

So all this fuss over savings worth less than a few power-saving lightbulbs is just... pointless.

TDP isn't a 24/7 figure, because nobody can game that long, let alone in the worst possible scenario; it's still less than 5-10% of what a refrigerator or other household appliances use each day. The overall "savings" are probably under 1-2% of the total power bill.
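
Back-of-the-envelope numbers behind that estimate (the extra wattage, gaming hours, and household consumption below are assumptions, not measurements):

    # Rough check of the "savings are <1-2% of the bill" estimate.
    # Extra wattage, gaming hours, and household usage are assumptions.

    EXTRA_GPU_W       = 75    # how much more a hot GPU draws vs. an efficient one
    GAMING_H_PER_DAY  = 3     # nowhere near 24/7 at full TDP
    HOUSEHOLD_KWH_DAY = 30    # typical household daily consumption

    extra_kwh = EXTRA_GPU_W * GAMING_H_PER_DAY / 1000
    share = extra_kwh / HOUSEHOLD_KWH_DAY * 100
    print(f"{extra_kwh:.2f} kWh/day extra, {share:.1f}% of household use")
    # -> 0.23 kWh/day, ~0.8%: inside the post's <1-2% ballpark.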

For a real enthusiast, this is not that much.

Feed the Kraken, shave the whales: go install a heat pump or solar panels or something. Reduce costs, help the planet. GPUs aren't the greatest waste of energy the world has ever known.

We had a similar global craze 1-2 years ago, when a 30 W maximum CPU power draw was treated as a sin against nature. Hey, CPUs have been using ~100 W for, like, forever. Admittedly, GPU power usage has increased; once upon a time GPUs didn't even have additional power connectors.
 