
AMD Responds to NVIDIA's GPP: AIB Partners to Announce New Radeon-Exclusive Brands

Joined
Dec 6, 2016
Messages
152 (0.06/day)
System Name The cube
Processor AMD Ryzen 7 5700G
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800MHz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
Hot, slow, power hungry.

... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75C... that is A LOT for water cooling. Worse yet, since the Kraken only covers the GPU, the whole card is almost uniformly heated to 70C, which makes me worry. I installed VRAM heatsinks and a custom backplate, and temps went down to 64C on the card and 72C for the GPU, which is still a lot for a "cool card". As for power usage, the card draws 180-220W when using the stock cooler. It seems Nvidia advertised the power usage in non-boost scenarios: at 1600MHz it does indeed use 180W, but at 1744MHz (its boost clock at 83C) it goes up to 220W. At 1911MHz (boost clock on water cooling) it goes up to a staggering 250-260W...

So it's not a cool card, and it has pretty high power consumption. It is pretty fast and a good product overall, but I was expecting better.
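
Side note on why the draw climbs so steeply with clocks: dynamic switching power scales roughly as C·V²·f, and boost raises voltage along with frequency. Here's a back-of-the-envelope sketch; the voltages are my guesses (the real V/f curve varies per card), with the constant fitted to my 180W @ 1600MHz point:

```python
# Back-of-the-envelope: dynamic switching power scales roughly as
# P = C * V^2 * f. C is fitted to the measured 180 W @ 1600 MHz point;
# the voltages are illustrative guesses, not the card's actual V/f curve.

def dynamic_power(freq_hz: float, volts: float, c_eff: float) -> float:
    """Estimated switching power in watts: P = C_eff * V^2 * f."""
    return c_eff * volts ** 2 * freq_hz

BASE_HZ, BASE_V, BASE_W = 1600e6, 0.90, 180.0   # measured baseline (voltage guessed)
C_EFF = BASE_W / (BASE_V ** 2 * BASE_HZ)        # effective switched capacitance

for mhz, volts in [(1600, 0.90), (1744, 0.95), (1911, 1.00)]:
    watts = dynamic_power(mhz * 1e6, volts, C_EFF)
    print(f"{mhz} MHz @ {volts:.2f} V -> ~{watts:.0f} W")

# Prints ~180 W, ~219 W, ~265 W -- the same ballpark as the 180 W, 220 W
# and 250-260 W the card actually drew at those clocks.
```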
 

bug

Joined
May 22, 2015
Messages
13,232 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75C... that is A LOT for water cooling. Worse yet, since the Kraken only covers the GPU, the whole card is almost uniformly heated to 70C, which makes me worry. I installed VRAM heatsinks and a custom backplate, and temps went down to 64C on the card and 72C for the GPU, which is still a lot for a "cool card". As for power usage, the card draws 180-220W when using the stock cooler. It seems Nvidia advertised the power usage in non-boost scenarios: at 1600MHz it does indeed use 180W, but at 1744MHz (its boost clock at 83C) it goes up to 220W. At 1911MHz (boost clock on water cooling) it goes up to a staggering 250-260W...

So it's not a cool card, and it has pretty high power consumption. It is pretty fast and a good product overall, but I was expecting better.
Interestingly enough, what you achieved with water, others have done with air: https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Mini/33.html
 
Joined
Dec 6, 2005
Messages
10,881 (1.62/day)
Location
Manchester, NH
System Name Senile
Processor I7-4790K@4.8 GHz 24/7
Motherboard MSI Z97-G45 Gaming
Cooling Be Quiet Pure Rock Air
Memory 16GB 4x4 G.Skill CAS9 2133 Sniper
Video Card(s) GIGABYTE Vega 64
Storage Samsung EVO 500GB / 8 Different WDs / QNAP TS-253 8GB NAS with 2x10Tb WD Blue
Display(s) 34" LG 34CB88-P 21:9 Curved UltraWide QHD (3440*1440) *FREE_SYNC*
Case Rosewill
Audio Device(s) Onboard + HD HDMI
Power Supply Corsair HX750
Mouse Logitech G5
Keyboard Corsair Strafe RGB & G610 Orion Red
Software Win 10
If only a manufacturer sold the same panel for both...

Acer XF270HU MSRP $599 - Sale $399

Acer XB271HU MSRP $799 - Sale $599

Hmm.. Still a $200 difference

Acer also sells the same panels in both FreeSync and G-Sync versions at 4K 60Hz and 34" 21:9 144Hz. I'll let you guess the price difference. Here's a hint: see above.



The blind gaming test was eye-opening, great find. Subjective and one game, which they point out quite well. I guess you'd need a double-blind test with a good cross-section of games for really objective data.

On a related subjective note, I've got a Vega 64 and FreeSync monitor... it's the best visual gaming experience I've ever encountered.
 
Joined
Sep 2, 2014
Messages
259 (0.07/day)
Location
Emperor's retreat/Naboo Moenia
System Name Order66
Processor Ryzen 7 3700X
Motherboard Asus TUF GAMING B550-PLUS
Cooling AMD Wraith Prism (BOX-cooler)
Memory 16GB DDR4 Corsair Desktop RAM Vengeance LPX 3200MHz Red
Video Card(s) GeForce RTX 3060Ti
Storage Seagate FireCuda 510 1TB SSD
Display(s) Asus VE228HR
Case Thermaltake Versa C21 RGB
Audio Device(s) onboard Realtek
Power Supply Corsair RM850x
Software Windows10 64bit
... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75C... that is A LOT for water cooling. Worse yet, since the Kraken only covers the GPU, the whole card is almost uniformly heated to 70C, which makes me worry. I installed VRAM heatsinks and a custom backplate, and temps went down to 64C on the card and 72C for the GPU, which is still a lot for a "cool card". As for power usage, the card draws 180-220W when using the stock cooler. It seems Nvidia advertised the power usage in non-boost scenarios: at 1600MHz it does indeed use 180W, but at 1744MHz (its boost clock at 83C) it goes up to 220W. At 1911MHz (boost clock on water cooling) it goes up to a staggering 250-260W...

So it's not a cool card, and it has pretty high power consumption. It is pretty fast and a good product overall, but I was expecting better.

I have the Palit GTX 1080 GameRock ( https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/29.html ) in my brother's PC and have never noticed any temperature problems.
As I can see from your system specs, you have MSI's GTX 1080 Aero ( https://www.newegg.com/Product/Product.aspx?Item=N82E16814127944 ), which uses a reference-like blower design, so it will perform about the same as the reference card.
If you look at the charts I linked, you'll see the reference GTX 1080 can indeed reach 83°C. It's never a good idea to pick a reference cooling design when you can choose a much better one.
 
Joined
Sep 17, 2014
Messages
20,949 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I have the Palit GTX 1080 GameRock ( https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/29.html ) in my brother's PC and have never noticed any temperature problems.
As I can see from your system specs, you have MSI's GTX 1080 Aero ( https://www.newegg.com/Product/Product.aspx?Item=N82E16814127944 ), which uses a reference-like blower design, so it will perform about the same as the reference card.
If you look at the charts I linked, you'll see the reference GTX 1080 can indeed reach 83°C. It's never a good idea to pick a reference cooling design when you can choose a much better one.

It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit the thermal ceiling. It was designed that way.

I have a 1080 Gaming X, and only at stock will it stay under 80 C. Even then, depending on the game, once the GPU is at 100% load and pushing high FPS or using all of its resources, you will see it go to 79-83 C over time. Kepler behaved the same, and all the cards in between do it as well. They just boost until they hit that 83 C ceiling, and at 79 C they start clocking down or reducing the vcore to stay under it.

What really matters is what clocks you can hold at that temp, not so much the temperature itself. This doesn't make it a hot GPU, by the way; relative to their performance and power usage, Kepler, Maxwell and Pascal run quite cool. But they are also designed to use that headroom to extract higher performance. Most air coolers can already push the clocks to the limit; going cooler nets minimal gains. It's all very tightly managed, and it is exactly this refined form of GPU Boost that AMD is so sorely lacking, even today.

Charts & reviews =/= real life and performance under prolonged use and varied loads.

... my GTX 1080 does 83C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower style cooler with a NZXT Kraken water cooler, and temps on the GPU went down.. to 75C... that is A LOT for water cooling - worse yet, since the kraken only covers the GPU, the whole card is allmost uniformally heated to 70C, witch makes me worry. I installed vram heatsinks and a custom backplate, and temps went down to 64C on the card and 72 for the GPU - witch is still a lot for a "cool card". As for power usage, the card draws 180-220W when using the stock cooler. It seems nvidia advertised the power usage in non-boost scenaros, as at 1600mhz it does indeed use 180w - but at 1744MHz (it's boost clock at 83C) it goes up to 220w. At 1911MHz (boost clocks on water cooling) it goes up to a staggering 250-260w....

So it's not a cool card, and it has pretty high power consumption - but it is pretty fast, and a good product overall - but I was expecting better.

Your clocks pre and post water cooling prove my point too. Air already gets these cards to max or near-max clock potential; water is 99% epeen value.
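
For anyone curious what "boost to the ceiling, then back off" looks like mechanically, here's a toy sketch. To be clear, Nvidia's actual GPU Boost algorithm isn't public; the bin size, thresholds and thermal constants below are illustrative guesses built around the 79-83 C behaviour described above:

```python
# Toy model of the boost behaviour described above. Nvidia's real GPU Boost
# algorithm is not public: the ~13 MHz bin is the commonly observed Pascal step,
# and the thresholds/thermal constants here are illustrative guesses.

CEILING_C = 83.0   # hard thermal ceiling (per the post)
BACKOFF_C = 79.0   # where the card starts shedding clocks (per the post)
BIN_MHZ = 13       # approximate Pascal clock step

def adjust_clock(clock_mhz: int, temp_c: float, max_boost_mhz: int) -> int:
    """One governor tick: boost while cool, shed bins as the ceiling nears."""
    if temp_c >= CEILING_C:
        return clock_mhz - 2 * BIN_MHZ                 # over the ceiling: drop hard
    if temp_c >= BACKOFF_C:
        return clock_mhz - BIN_MHZ                     # near it: ease off
    return min(clock_mhz + BIN_MHZ, max_boost_mhz)     # headroom: keep boosting

# Tiny demo with a made-up thermal model (temperature lags the clock):
clock, temp = 1600, 60.0
for _ in range(60):
    clock = adjust_clock(clock, temp, max_boost_mhz=1911)
    target = 60.0 + (clock - 1600) * 0.075   # steady-state temp for this clock
    temp += 0.3 * (target - temp)            # first-order thermal inertia
print(f"settles near {clock} MHz at {temp:.0f} C")
# The clock races up, the temperature catches up, and the card ends up shedding
# bins and hovering around its temperature target instead of holding max boost.
```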
 

bug

Joined
May 22, 2015
Messages
13,232 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@Vayra86 I will add that these days chips (CPUs, GPUs) are designed to run at 80C or above. You may want to lower that if you're pushing the transistors with extra voltage, but otherwise chasing lower temps is akin to driving 60mph on a 75mph road.
What matters when comparing cards is the TDP. That's what goes onto your electricity bill and determines how much cooling you need.

Edit: I will add that keeping temps in check used to be of paramount importance back when chips couldn't throttle themselves. But that was then.
 
Joined
Jan 29, 2016
Messages
128 (0.04/day)
System Name Ryzen 5800X-PC / RyzenITX (2nd system 5800X stock)
Processor AMD Ryzen 7 5800X (atx) / 5800X itx (soon one pc getting 5800X3D upgrade! ;)
Motherboard Gigabyte X570 AORUS MASTER (ATX) / X570 I Aorus Pro WiFi (ITX)
Cooling AMD Wraith Prism (box cooler) / Alpenföhn Black Ridge (ITX)
Memory OLOY 2x16GB (32GB) DDR4 4000MHz CL18 (22-22-22-42) 1.40V, ATX & ITX PCs (2000 FCLK)
Video Card(s) AMD Radeon RX 6800 XT (ATX) /// AMD Radeon RX 6700 XT 12GB GDDR6 (ITX)
Storage (Sys)Sammy 970EVO 500GB & SabrentRocket 4.0+ 2TB (ATX) | SabrentRocket4.0+ 1TB NVMe (ITX)
Display(s) 30" Ultra-Wide 21:9 200Hz/AMD FREESYNC 200hz/144hz LED LCD Montior Connected Via Display Port (x2)
Case Lian Li Lancool II Mesh (ATX) / Velkase Velka 7 (ITX)
Audio Device(s) Realtek HD ALC1220 codec / Onboard HD Audio* (BOTH) w/ EQ settings
Power Supply 850w (Antec High-Current Gamer) HC-850 PSU (80+ gold certified) ATX) /650Watt Thermaltake SFX (ITX)
Mouse Logitech USB Wireless KB & MOUSE (Both Systems)
Keyboard Logitech USB Wireless KB & MOUSE (Both Systems)
VR HMD Oculus Quest 2 - 128GB - Standalone + Oculus link PC
Software Windows 10 Home x64bit 2400 /BOTH SYSTEMS
Benchmark Scores CPUZ - ATX-5800X (ST:670) - (MT: 6836.3 ) CPUZ - ITX -5800X (ST:680.2) - (MT: 7015.2) ??? same CPU?
A lot of fluff, but it doesn't really say anything other than "We can't compete in these performance niches, so we'll spout platitudes instead." Since the subject came up, all I can remember is the old Wendy's commercial with the granny muttering "Where's the beef?". When nVidia came out with PhysX, AMD could have a) produced a competing technology or b) licensed it. When nVidia came out with G-Sync, they could have a) produced a competing technology or b) licensed it... instead they chose c) create a name similar to G-Sync, provide only part of the technology, and sell the lesser-featured package at a reduced price. AMD could have included a hardware module in the FreeSync monitors, but they chose not to... some FreeSync monitor manufacturers did include such an MBR (motion blur reduction) module, but those models were not well received, because once FreeSync monitors added the necessary hardware they no longer had that big price advantage... and AMD never jumped on the MBR bandwagon because they chose instead to sell on price.

nVidia has been taking more and more control from its board partners (legally, driver-wise and physically) with successive generations. Now it is willing to give third-party vendors a boost by partnering with them to create high-performance model lines that customers are willing to pay for: we will write our drivers to allow higher clocks if, during the install, they detect PCBs that meet certain criteria with regard to voltage control, cooling, etc.

And if they do so, all they are saying is: if you are using what we give you to increase mindshare and generate high margins, you can't let our competitor take advantage of the branding ***we*** helped you build. This is business as usual in America... newsflash: America is a capitalist, dog-eat-dog country... deal with it. If you own a pizza joint, Coca-Cola will give you a refrigerator to hold its products... if you put Pepsi in there, you violate the licensing agreement and they take back THEIR fridge.

Where's the beef? If Asus calls its nVidia line Strix and its Radeon line Arez, so what? If AMD said that Asus can't sell an AMD-based card called Arez, would there be such a stink? Burger King can sell a 1/4-pound burger, but they cannot call it "the Quarter Pounder". There is nothing anti-competitive here; there's nothing more sinister about limiting the use of the name than there is about not putting our competitor's products in the free fridge we gave you. In the end, all AMD is saying is "well, we're gonna offer free fridges too"... and now when we buy pizza, we'll see two fridges, one with AMD stuff inside and one with nVidia... great, EXACTLY what I wanted: a logo on top of the fridge telling me this is where I can find an nVidia product and here's where I can find an AMD product. Nothing anti-competitive, more like truth in advertising. The nVidia Strix products of recent generations have been overclocking by 14-31%; the AMD cards are in single digits for the most part. The only thing AMD loses from the name-limiting partnership agreement is that no one will buy a product thinking that, because their nVidia Strix overclocks 25%, their AMD Strix can do the same.

I hope Intel soon does the same, as I am frustrated by confused users sending me proposed 8700K builds with X370 mobos because they think X370 is a cheaper version of the Z370.
Wrong, buddy.

Ageia Technologies was the CREATOR of PhysX, NOT nVidia. nVidia strong-armed them and acquired the tech from Ageia Technologies. Do your research! The same as they did to 3dfx back in the day. Remember 3dfx? 3dfx was my favorite 3D accelerator of that era, and then nVidia destroyed them; I've never supported nVidia since.

https://en.wikipedia.org/wiki/Ageia

References:
  1. AGEIA Acquires Meqon Research AB, Mountain View, Calif., September 1, 2005
  2. Smalley, Tim. "Nvidia set to acquire Ageia", bit-tech.net, 4 February 2008. Accessed at http://www.bit-tech.net/news/2008/02/04/nvidia_set_to_acquire_ageia/1 on 5 February 2008.
  3. NVIDIA completes Acquisition of AGEIA Technologies, NVIDIA, Santa Clara, CA, February 13, 2008 (press release)
  4. Nvidia finalises Ageia deal, details future plans, Tim Smalley, 14 February 2008, bit-tech
  5. "Overview". PhysX. GeForce. Retrieved 2 April 2013.
nVidia also made it so that even the original Ageia PhysX cards would NOT work alongside ATI GPUs at the time, and PhysX still doesn't run with ATI or AMD/RTG GPUs today (technically it was ATI back then; AMD now owns RTG, which is what ATI became).
 
Joined
Aug 14, 2017
Messages
74 (0.03/day)
1st, AMD doesn't earn the 'gaming' or 'ROG' marks with their lousy GPUs, and yet they still want to fly on nVidia's wings.

It all started with nVidia's own trademark and slogan, 'The Way It's Meant To Be Played', and of course AMD wants a piece of it. And with AMD it's always the cheating and handicapping way... that comment is typical too... just sympathy licking... huh!

2nd, nVidia's excellent GPUs let us play 4K games right now; without them we'd have to use AMD's junk GPUs... omggg!

AMD, just start the work... planning, building, testing, and then release a good product for your customers... for three generations of AMD GPUs now, we've gotten ONLY old-tech GPUs under different names, with LOUSY, TERRIBLE EFFICIENCY.

Shame, AMD!
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,000 (4.60/day)
Location
Kepler-186f
1st, AMD doesn't earn the 'gaming' or 'ROG' marks with their lousy GPUs, and yet they still want to fly on nVidia's wings.

It all started with nVidia's own trademark and slogan, 'The Way It's Meant To Be Played', and of course AMD wants a piece of it. And with AMD it's always the cheating and handicapping way... that comment is typical too... just sympathy licking... huh!

2nd, nVidia's excellent GPUs let us play 4K games right now; without them we'd have to use AMD's junk GPUs... omggg!

AMD, just start the work... planning, building, testing, and then release a good product for your customers... for three generations of AMD GPUs now, we've gotten ONLY old-tech GPUs under different names, with LOUSY, TERRIBLE EFFICIENCY.

Shame, AMD!

Did you have a stroke while writing that? Or is English not your first language? My poor poor eyes...
 
Joined
Dec 6, 2016
Messages
152 (0.06/day)
System Name The cube
Processor AMD Ryzen 7 5700G
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800MHz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit the thermal ceiling. It was designed that way.

I have a 1080 Gaming X, and only at stock will it stay under 80 C. Even then, depending on the game, once the GPU is at 100% load and pushing high FPS or using all of its resources, you will see it go to 79-83 C over time. Kepler behaved the same, and all the cards in between do it as well. They just boost until they hit that 83 C ceiling, and at 79 C they start clocking down or reducing the vcore to stay under it.

What really matters is what clocks you can hold at that temp, not so much the temperature itself. This doesn't make it a hot GPU, by the way; relative to their performance and power usage, Kepler, Maxwell and Pascal run quite cool. But they are also designed to use that headroom to extract higher performance. Most air coolers can already push the clocks to the limit; going cooler nets minimal gains. It's all very tightly managed, and it is exactly this refined form of GPU Boost that AMD is so sorely lacking, even today.

Charts & reviews =/= real life and performance under prolonged use and varied loads.



Your clocks pre and post water cooling prove my point too. Air already gets these cards to max or near-max clock potential; water is 99% epeen value.

You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry - yet somehow, people say Vega is hot and hungry.
 

bug

Joined
May 22, 2015
Messages
13,232 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry - yet somehow, people say Vega is hot and hungry.
Let's see.
Temps:
1080Ti - 84C https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/34.html
Vega64 - 85C (when not holding back) https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/36.html

Avg power under load:
1080Ti - 231W
Vega64 - 292W

So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry. An extra 60W on average while gaming will show up on your electricity bill. It won't break the bank, but it will be there.
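
For scale, a quick back-of-the-envelope. The wattages are the review averages above; the usage and price-per-kWh are my own assumptions, not anything from TPU:

```python
# Quick scale check (my assumptions: 3 h of gaming a day, $0.13/kWh;
# the 231 W / 292 W averages are from the TPU reviews linked above):
delta_w = 292 - 231                        # Vega 64 minus 1080 Ti, avg gaming draw
kwh_per_year = delta_w / 1000 * 3 * 365    # ~66.8 kWh
print(f"~{kwh_per_year:.0f} kWh/yr -> ~${kwh_per_year * 0.13:.0f}/yr")
# ~67 kWh/yr -> ~$9/yr: visible on the bill, but indeed not bank-breaking.
```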
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,000 (4.60/day)
Location
Kepler-186f
Let's see.
Temps:
1080Ti - 84C https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/34.html
Vega64 - 85C (when not holding back) https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/36.html

Avg power under load:
1080Ti - 231W
Vega64 - 292W

So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry. An extra 60W on average while gaming will show up on your electricity bill. It won't break the bank, but it will be there.


:eek: News to me; my MSI 1080 Ti Duke never breaks 62C, and that's at 2060MHz core in a Time Spy Extreme stress test. Though to be fair, I do have an aggressive fan curve, so I suppose a three-fan cooler on an aggressive fan curve is not a fair comparison :D
 

bug

Joined
May 22, 2015
Messages
13,232 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
:eek: News to me; my MSI 1080 Ti Duke never breaks 62C, and that's at 2060MHz core in a Time Spy Extreme stress test. Though to be fair, I do have an aggressive fan curve, so I suppose a three-fan cooler on an aggressive fan curve is not a fair comparison :D
Yes, custom solutions work to varying degrees of effectiveness; that's why I picked reference designs for the comparison. Though at the end of the day, it's still Vega that outputs more heat.
 
Joined
Oct 22, 2014
Messages
13,210 (3.80/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit