
AMD Responds to NVIDIA's GPP: AIB Partners to Announce New Radeon-Exclusive Brands

hot, slow, power hungry.

... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75C... that is A LOT for water cooling. Worse yet, since the Kraken only covers the GPU, the whole card is almost uniformly heated to 70C, which makes me worry. I installed VRAM heatsinks and a custom backplate, and temps went down to 64C on the card and 72C for the GPU - which is still a lot for a "cool card". As for power usage, the card draws 180-220W with the stock cooler. It seems nvidia advertised the power usage for non-boost scenarios: at 1600MHz it does indeed use 180W, but at 1744MHz (its boost clock at 83C) it goes up to 220W. At 1911MHz (boost clock on water cooling) it goes up to a staggering 250-260W...

So it's not a cool card, and it has pretty high power consumption - but it is pretty fast and a good product overall. I was just expecting better.
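Those numbers actually line up with the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared. Here's a quick sanity check in Python - note the boost voltages are my assumptions (typical Pascal boost bins sit somewhere around 1.00-1.09V), not readings from this card:

```python
# Rough sanity check: dynamic power scales roughly with f * V^2.
# The voltages below are assumptions (typical Pascal boost-bin range),
# not readings from this particular card.
base_clock, base_power, base_volt = 1600, 180.0, 1.000  # MHz, W, V

points = [
    (1744, 1.050),  # assumed voltage at the 83C air-cooled boost clock
    (1911, 1.093),  # assumed voltage at the water-cooled boost clock
]

for clock, volt in points:
    est = base_power * (clock / base_clock) * (volt / base_volt) ** 2
    print(f"{clock} MHz @ {volt:.3f} V -> ~{est:.0f} W")
# -> 1744 MHz @ 1.050 V -> ~216 W   (observed: ~220 W)
# -> 1911 MHz @ 1.093 V -> ~257 W   (observed: 250-260 W)
```

So the jump from 180W to 250-260W is about what the physics predicts once the card boosts into higher clock/voltage bins - it's not a misbehaving sample.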
 
... my GTX 1080 does 83C and then throttles down. A lot. [...] I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75C... [...]
Interestingly enough, what you achieved with water, others have done with air: https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Mini/33.html
 
If only a manufacturer sold the same panel in both FreeSync and G-Sync versions...

Acer XF270HU (FreeSync): MSRP $599 - Sale $399

Acer XB271HU (G-Sync): MSRP $799 - Sale $599

Hmm... still a $200 difference.

Acer also sells the same panels in both FreeSync and G-Sync versions at 4K 60Hz and 34" 21:9 144Hz. I'll let you guess the price difference. Here's a hint: see above.



The blind gaming test was eye-opening - great find. Subjective and only one game, which they point out quite clearly. I guess you'd need a double-blind test with a good cross-section of games for really objective data.

On a related subjective note, I've got a Vega 64 and FreeSync monitor... it's the best visual gaming experience I've ever encountered.
 
... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. [...]

I have the Palit GTX 1080 GameRock ( https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/29.html ) in my brother's PC and never noticed any temperature problems.
As I can see from your system specs, you have MSI's GTX 1080 Aero ( https://www.newegg.com/Product/Product.aspx?Item=N82E16814127944 ), which uses a reference-style blower design, so it will perform similarly to the reference card.
If you look at the charts I linked, you'll see that the reference GTX 1080 can indeed reach 83°C. It's never a good idea to pick a reference cooling design when you can choose a much better one.
 
I have the Palit GTX 1080 GameRock in my brother's PC and never noticed any temperature problems. [...]

It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit its thermal ceiling. It's because it was designed that way.

I have a 1080 Gaming X, and only at stock will it stay under 80 C - but even then, depending on the game, once this GPU is at 100% load and pushing high FPS or using all of its resources, you will see it climb to 79-83 C over time. Kepler behaved the same, and all cards in between do it as well. They just boost until they hit that 83 C ceiling, and at 79 C they start clocking down or reducing the vcore to stay under it.

What really matters is what clocks you can hold at that temperature, not so much the temperature itself. This doesn't make it a hot GPU, by the way; relative to their performance and power usage, Kepler, Maxwell and Pascal run quite cool. But they are also designed to use that headroom to extract higher performance. Most air coolers can already push the clocks close to the limit, so going cooler nets minimal gains. It's all very tightly managed, and it is exactly this refined form of GPU Boost that AMD so sorely lacks, even today.
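GPU Boost's exact algorithm is NVIDIA's secret sauce, but the behaviour described above is basically a small closed-loop governor. A toy sketch in Python - every threshold, bin size and the fake thermal model here are illustrative assumptions, not NVIDIA's real values:

```python
# Toy model of a GPU-Boost-style thermal governor. Illustrative only:
# the real thresholds, bin tables and voltage curves are NVIDIA's and unpublished.
TEMP_TARGET = 83.0   # C, hard boost ceiling
TEMP_BACKOFF = 79.0  # C, stop climbing before the ceiling
BIN_MHZ = 13         # one boost bin (Pascal steps in ~13 MHz increments)
MAX_CLOCK, MIN_CLOCK = 1911, 1607

def next_clock(clock: int, temp: float) -> int:
    """Pick the next clock bin based on the current temperature."""
    if temp >= TEMP_TARGET:
        # At the ceiling: shed a bin (the driver also trims vcore here).
        return max(MIN_CLOCK, clock - BIN_MHZ)
    if temp >= TEMP_BACKOFF:
        # In the backoff band: hold, don't climb further.
        return clock
    # Thermal headroom available: spend it on another bin.
    return min(MAX_CLOCK, clock + BIN_MHZ)

# Crude demo with a fake thermal model: the faster it runs, the hotter it gets.
clock, temp = MIN_CLOCK, 60.0
for _ in range(300):
    clock = next_clock(clock, temp)
    temp += 0.2 * ((60.0 + (clock - MIN_CLOCK) * 0.17) - temp)
print(f"settled at {clock} MHz, {temp:.0f} C")
```

Run it and the clock climbs while there's thermal headroom, then gets pinned by the temperature loop well below the maximum boost bin - which is exactly why a better cooler only buys you the last few bins.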

Charts & reviews =/= real life and performance under prolonged use and varied loads.

... my GTX 1080 does 83C and then throttles down. A lot. [...] At 1911MHz (boost clock on water cooling) it goes up to a staggering 250-260W... [...]

Your clocks pre- and post-water-cooling prove my point too. Air already gets these cards to max or near-max clock potential; water is 99% e-peen value.
 
@Vayra86 I will add that these days chips (CPUs, GPUs) are designed to run at 80C or higher. You may want to bring that down if you're pushing the transistors with extra voltage, but otherwise lowering the temps is akin to going 60mph on a 75mph road.
What matters when comparing cards is the TDP. That goes onto your electricity bill and determines how much cooling you need.

Edit: I will add that keeping temps in check used to be of paramount importance back when chips couldn't throttle themselves. But that was then.
 
A lot of fluff that doesn't really say anything other than "We can't compete in these performance niches, so we'll spout platitudes instead." Since the subject came up, all I can think of is that old Wendy's commercial with the granny muttering "Where's the beef?". When nVidia came out with PhysX, AMD could have a) produced a competing technology or b) licensed it. When nVidia came out with G-Sync, they could have a) produced a competing technology or b) licensed it... instead they chose c) create a name similar to G-Sync, provide only part of the technology, and sell the lesser-featured package at a reduced price. AMD could have included a hardware module in FreeSync monitors, but they chose not to... some FreeSync monitor manufacturers did include such a motion blur reduction (MBR) module, but those weren't well received, because once a FreeSync monitor added the hardware needed for motion blur reduction, it no longer had that big price advantage... and AMD never jumped on the MBR bandwagon because they chose instead to sell on price.

nVidia has been taking more and more control from its board partners (legally, driver-wise, and physically) with each successive generation. Now it is willing to give third-party vendors a boost by partnering with them to create high-performance model lines that customers are willing to pay for: "we will write our drivers to allow higher clocks if, during the install, they detect PCBs that meet certain criteria for voltage control, cooling, etc."

And if they do so, all they are saying is: if you are using what we give you to increase mindshare and generate high margins, you can't let our competitor take advantage of the branding ***we*** helped you build. This is business as usual in America... newsflash... America is a capitalist, dog-eat-dog country; deal with it. If you own a pizza joint, Coca-Cola will give you a refrigerator to hold its products... you want to put Pepsi in there, you violate the licensing agreement and we take back OUR fridge.

Where's the beef? If Asus calls its nVidia line Strix and its Radeon line Arez, so what? If AMD said that Asus couldn't sell an AMD-based card called Arez, would there be such a stink? Burger King can sell a 1/4-pound burger, but they cannot call it "the Quarter Pounder". There is nothing anti-competitive, nothing more sinister, about limiting the use of the name than there is about not putting a competitor's products in the free fridge we gave you. In the end, all AMD is saying is "well, we're gonna offer free fridges too"... and now when we buy pizza we'll see two fridges, one with AMD stuff inside and one with nVidia... great, EXACTLY what I wanted: a logo on top of the fridge telling me this is where I can find an nVidia product and here's where I can find an AMD product. Nothing anti-competitive; more like truth in advertising. The nVidia Strix products of recent generations overclock by 14-31%; the AMD cards are in single digits for the most part. The only thing AMD loses from the name-limiting partnership agreement is that no one will buy a product thinking that because their nVidia Strix OCs 25%, their AMD Strix can do the same.

I hope Intel soon does the same, as I am frustrated by confused users sending me proposed 8700K builds with X370 motherboards because they think X370 is a cheaper version of the Z370.
Wrong, buddy.

Ageia Technologies was the CREATOR of PhysX, not NVIDIA. NVIDIA strong-armed them and acquired the tech from Ageia in 2008. Do your research! The same thing they did to 3dfx back in 2000. Remember 3dfx? 3dfx was my favorite 3D accelerator of that era... then nvidia destroyed them, and I've never supported nvidia since.

https://en.wikipedia.org/wiki/Ageia

nvidia also made it so that even the original Ageia PhysX cards would NOT work alongside ATI GPUs at the time - and that still holds for AMD/RTG GPUs today (technically it was ATI back then; AMD now owns what became RTG).
 
1st: AMD hasn't earned the 'gaming' or 'ROG' branding with their lousy GPUs, and yet it still wants to fly on nvidia's wings.

It all started with nvidia's own trademark and its slogan 'The Way It's Meant To Be Played', and of course AMD wants a piece of it. And whenever we get AMD, it's always some cheat-and-handicap approach... that statement of theirs is typical too... just sympathy-licking... huh!

2nd: nvidia's excellent GPUs let us play 4K games right now; without them we'd be stuck with AMD's junk GPUs... omggg!

AMD, just start working: planning, building, testing, and then release a good product for the customer... for three generations of AMD GPUs now, we've gotten only old-tech GPUs under different names, with LOUSY, TERRIBLE EFFICIENCY.

Shame, AMD!
 
1st: AMD hasn't earned the 'gaming' or 'ROG' branding with their lousy GPUs, and yet it still wants to fly on nvidia's wings. [...] Shame, AMD!

Did you have a stroke while writing that? Or is English not your first language? My poor poor eyes...
 
It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit its thermal ceiling. [...] Air already gets these cards to max or near-max clock potential; water is 99% e-peen value.

You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry. Yet somehow, people say Vega is hot and hungry.
 
You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry. Yet somehow, people say Vega is hot and hungry.
Let's see.
Temps:
1080Ti - 84C https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/34.html
Vega64 - 85C (when not holding back) https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/36.html

Avg power under load:
1080Ti - 231W
Vega64 - 292W

So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry. 60W average while gaming will show up on your electricity bill. Won't break the bank, but it will be there.
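To put a rough number on that extra draw (hours per day and electricity rate below are assumptions for illustration - plug in your own):

```python
# Back-of-envelope cost of the extra draw. Hours per day and price per
# kWh are assumptions for illustration; plug in your own numbers.
extra_watts = 292 - 231  # Vega 64 vs 1080 Ti average gaming draw (TPU numbers above)
hours_per_day = 3        # assumed gaming time
usd_per_kwh = 0.13       # assumed electricity rate

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"~{extra_kwh_per_year:.0f} kWh/year -> ~${extra_kwh_per_year * usd_per_kwh:.0f}/year")
# -> ~67 kWh/year -> ~$9/year
```

Call it somewhere around ten dollars a year at a typical US rate - real, but not bank-breaking, as said.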
 
Temps: 1080Ti - 84C, Vega64 - 85C (when not holding back). Avg power under load: 1080Ti - 231W, Vega64 - 292W. [...] So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry.


:o News to me - my 1080 Ti MSI The Duke never breaks 62C, and that's at 2060MHz core in the Time Spy Extreme stress test. Though to be fair, I do run an aggressive fan curve, so I suppose a three-fan cooler with an aggressive fan curve isn't a fair comparison :D
 
:eek: News to me - my 1080 Ti MSI The Duke never breaks 62C, and that's at 2060MHz core in the Time Spy Extreme stress test. [...]
Yes, custom solutions work to varying degrees of effectiveness - that's why I picked reference designs for the comparison. Though at the end of the day, it's still Vega that outputs more heat.
 