
NVIDIA GeForce RTX 3080 Founders Edition

Joined
Feb 13, 2012
Messages
512 (0.16/day)
Which 4K graph made you think so? The RTX 3080's minimum fps is above the RTX 2080 Ti's average. I can't remember the last time we had such a huge leap in performance. How is it "underwhelming"?

Maybe you could share some graphs to prove your PoV?



It's exactly as powerful as advertised, if not better, at 4K, the resolution it was created for. I've never seen so many AMD trolls in one discussion. Also, it costs $700, so I'm not sure about "greed." And no one is forcing you to buy it.

Speaking of greed:

perfrel_3840_2160.png

Here you go. The 1080 Ti came out at $700 and performed almost twice as fast as a 980 Ti that was also $700. Then the 20 series arrived and nearly doubled prices, with the 2080 Ti at around $1,200. Everyone is impressed now, but in reality Nvidia has just shifted back to normal pricing because they are actually expecting competition. What people here are concerned about is that the 3080 is a cut-down version of Nvidia's biggest gaming chip and draws power close to the PCIe thermal limit, and at that point you run into diminishing returns as you scale performance any higher. In summary: no, the 3080 is not a 2080 successor, it is a 2080 Ti successor, and the 2080 Ti was itself a cut-down Titan.
 
Joined
May 15, 2014
Messages
193 (0.08/day)
@W1zzard, try running FrameView 1.1 (even if you don't have the PCAT kit). It should give detailed data via NVAPI.
 
Joined
Apr 10, 2020
Messages
113 (0.58/day)
Thought about it, decided against it. Old DX11 engine, badly designed, badly optimized, not a proper simulator, small playerbase. When they add DX12, I might reconsider it.
I can agree on it being badly optimized at the moment, but not on it having a small player base or not being a proper simulator. Predicted sales are 2.27 million units over the next three years, and third-party devs will soon offer state-of-the-art, highly detailed planes as add-ons through the MS FS 2020 shop. FS2020 is predestined to become the next X-Plane/Prepar3D kind of sim.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
20,960 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
Joined
Jan 8, 2017
Messages
5,854 (4.23/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
And the fact that they practically doubled the shader units from the 2080 Ti (not the 2080 Super, mind you), along with 8 more ROPs and more RT/Tensor cores. I was realistically expecting this thing to exceed 300 W.
That's the catch: the shaders have been doubled, but not the number of SMs. In other words, more execution units now share the same control logic/registers/caches, which are the real power hogs in a chip.

Nvidia has done this before, to a more extreme extent, with Kepler, which had six times the number of shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance: GK110 was only 90% faster than GF110 despite having almost six times the FP32 units (GF110's shaders did run at a faster clock; still, they are worlds apart).

It's a similar story here: a lot of shading power, but it's used rather inefficiently because so many resources are shared. That's why the power consumption looks quite bad when you think about it.
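The per-SM arithmetic behind this point can be sketched quickly. A back-of-the-envelope check, using the commonly published per-SM unit counts and the RTX 3080's enabled SM count:

```python
# Back-of-the-envelope FP32-per-SM comparison across generations.
# Unit counts are the commonly published ones.
fp32_per_sm = {
    "Fermi GF110": 32,    # per SM
    "Kepler GK110": 192,  # per SMX
    "Turing TU102": 64,   # per SM
    "Ampere GA102": 128,  # per SM
}

kepler_vs_fermi = fp32_per_sm["Kepler GK110"] // fp32_per_sm["Fermi GF110"]
ampere_vs_turing = fp32_per_sm["Ampere GA102"] // fp32_per_sm["Turing TU102"]
print(kepler_vs_fermi)   # 6  (Kepler packed 6x the shaders into each SM)
print(ampere_vs_turing)  # 2  (Ampere doubles Turing's per-SM count)

# RTX 3080: 68 enabled SMs at 128 FP32 units each
rtx3080_shaders = 68 * fp32_per_sm["Ampere GA102"]
print(rtx3080_shaders)   # 8704
```

The shader count doubles while the SM count (and the control logic that comes with each SM) does not, which is the crux of the sharing argument above.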
 
Joined
Aug 12, 2019
Messages
421 (0.96/day)
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Asus Strix GTX 1070ti
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
AMD or Intel doesn't matter! It's time to upgrade from 10th gen and game on! It's almost a 3x difference in performance for me coming from a 1070 Ti :)
 

ZekeSulastin

New Member
Joined
Nov 27, 2019
Messages
2 (0.01/day)
Sorry to bother you, but did you take a look at either reducing the power limit or manually altering the voltage/frequency curve in Afterburner or similar? computerbase.de reported only a few percent less performance with a 270 W limit, and someone linked me a video of that curve being changed to good effect on another site.

It honestly sounds like Turing, where you can't drop the voltage with the slider but can with the curve editor.
 

specopsFI

New Member
Joined
Sep 10, 2019
Messages
12 (0.03/day)
@W1zzard

In the overclocking section, you briefly mention that undervolting is not possible. Can you elaborate on this very important point? Is it prevented on Ampere at a hardware level, or is it just that the necessary tools are not yet available? The inability to undervolt these extremely power-hungry chips would be a serious shortcoming, and I can't believe nVidia would remove it for this exact generation after allowing full access to the voltage/clock curve for so many previous generations.

Other than that, an excellent article as usual!
 

iO

Joined
Jul 18, 2012
Messages
453 (0.15/day)
Location
Germany
Processor R5 3600
Motherboard MSI B450i Gaming
Cooling Accelero Mono CPU Edition
Memory 16 GB VLP
Video Card(s) RX580 Armor w/ Accelero Mono
Storage P34A80 512GB
Display(s) LG 27UM67 UHD
Case none
Power Supply SS G-650
Great review as usual.

It's weird that they prioritized maximum performance over energy efficiency this time, unlike in previous gens. By going from 270 W to 320 W, they sacrificed roughly 15% energy efficiency for just 4% higher performance.
Kinda pulled an AMD there...
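That efficiency trade-off can be checked with the usual perf-per-watt definition. Illustrative arithmetic only, taking the +4% performance figure above at face value; the result lands in the same ballpark as the ~15% cited, and the exact number depends on the measured performance delta:

```python
# Perf/W cost of running the 3080 at 320 W instead of 270 W,
# assuming the +4% performance figure quoted above.
perf_270, watts_270 = 1.00, 270   # normalized perf at a 270 W limit
perf_320, watts_320 = 1.04, 320   # +4% perf at the stock 320 W

eff_loss = 1 - (perf_320 / watts_320) / (perf_270 / watts_270)
print(f"{eff_loss:.0%} lower perf/W at 320 W")  # 12% lower perf/W at 320 W
```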
 
Joined
Jun 24, 2020
Messages
42 (0.35/day)
The gains at 1440p and 1080p aren't as big; that could be the CPU not being fast enough. Could we test again when Zen 3 is out?
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,367 (0.32/day)
System Name Custom AMD Rig
Processor AMD Ryzen™ 7 3800X
Motherboard ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling EVGA CLC 280mm AIO Liquid Cooler
Memory G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA GAMING 10GB
Storage 250GB Samsung 970 EVO NVMe, 2TB Inland Premium NVMe, 1TB Crucial MX500 SATA, 4TB WD Blue SATA
Display(s) Acer Nitro XV340CK Pbmiipphzx 34" UWQHD 1440p, LG 27GL650F-B UltraGear 27" 1080p 144 Hz 1ms
Case NZXT H510i Matte White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Kingston HyperX Cloud Flight S
Power Supply Corsair RMx Series RM750x 750W
Mouse Kingston HyperX Pulsefire Dart
Keyboard Kingston HyperX Alloy Origins
Software Windows 10 Pro 64-bit 2004
That's the catch: the shaders have been doubled, but not the number of SMs. In other words, more execution units now share the same control logic/registers/caches, which are the real power hogs in a chip.

Nvidia has done this before, to a more extreme extent, with Kepler, which had six times the number of shaders per SM versus Fermi. As a result it was one of the most inefficient architectures ever per SM in terms of performance: GK110 was only 90% faster than GF110 despite having almost six times the FP32 units (GF110's shaders did run at a faster clock; still, they are worlds apart).

It's a similar story here: a lot of shading power, but it's used rather inefficiently because so many resources are shared. That's why the power consumption looks quite bad when you think about it.
No doubt that doubling the shader units within the same number of streaming multiprocessors (64/SM in Turing vs. 128/SM in Ampere) would increase power consumption. I don't consider it as efficient as the Maxwell-to-Pascal jump, per se, but I don't consider it a complete waste of power either. I'm also considering the fact that the RT and Tensor cores add to the total.

All in all, I believe the slight sacrifice in energy efficiency is justified, as long as it doesn't reach Fury X or 290X levels of wasted power. With this card I can probably play PUBG at 4K 144 Hz with competitive settings, a mix of medium and low settings with AA maxed out for the visual clarity that's important for spotting opponents.

TL;DR: This card is overkill for those still gaming at 1080p or 1440p. If you're aiming at ultrawide (3440x1440) or 4K, it seems to hit the sweet spot.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
20,960 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
Joined
May 15, 2014
Messages
193 (0.08/day)
I don't see why the power consumption is that surprising if you look at the specs. Sure, it's on 8 nm now, but that's coming from 12 nm.
Nitpick: TSMC 16FF (12N) -> Samsung 10LPU (8N). 320 W+ is a 30% increase in power for a ~30% increase in performance.

No doubt that doubling the shader units in the same amount of streaming multiprocessors (64/SM Turing vs 128/SM in Ampere) would increase power consumption. If anything, I don't consider it as efficient as Maxwell>Pascal, per se, but I also don't consider it being a complete waste of power as well. I am also considering the fact that those RT and Tensors cores also add to the weight.
It means little when it's dark silicon. It's largely useful for compute/CUDA workloads and at 4K, where each frame becomes more ALU-limited. The increased number and revised partitioning of the ROPs also help at 4K.

We may be doing GA a disservice, given that RT and denoising should show gains once games and the compiler are better optimised.

GPU-Z already shows the same data, using NVAPI, too
With a nice in-game overlay? :)
 
Joined
Aug 5, 2020
Messages
134 (1.72/day)
System Name BUBSTER
Processor I7 8700K
Motherboard ASUS Maximus Hero X
Cooling Deepcool Gamer Storm Captain 240 EX
Memory 32GB G.Skill Trident Z RGB LED DDR4 3866MHz 4x8GB
Video Card(s) SLI 2 x Asus GeForce GTX 1080 ROG Strix Gaming Advanced
Storage Samsung 970 EVO M2 NVME + 2 Tb RAID 0 Samsung 850 EVO 500GB X4 + 12 TB HDD
Display(s) Sony Bravia KD-55X8505C
Case Corsair Carbide Air 540
Audio Device(s) Asus Xonar Essence STX II
Power Supply Corsair AX 860i
Mouse Corsair Gaming M65 Pro RGB + Razr Taipan
Keyboard Asus ROG Strix Flare Cherry MX Red + Corsair Gaming K65 lux RGB
Software Windows 10 Pro x64
Benchmark Scores overall: 13177 Graphics 3d Mark : 15523
I've had 1080 SLI for 3 years already and it's great for the 4K experience: a minimum of 70-80 fps at Ultra in most games...
 
Joined
Apr 29, 2014
Messages
3,895 (1.64/day)
Location
Texas
System Name SnowFire / The Reinforcer / New Portable?
Processor i7 10700K 5.0ghz (24/7) / 2x Xeon E52650v2 / AMD Ryzen 5 3600X
Motherboard Asus Strix Z490 / Dell Dual Socket (R720) / Asrock X570 ITX
Cooling RX 360mm + 140mm Custom Loop / Dell Stock / Noctua L9i (Yes L9i)
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb / Corsair RGB 3200 16gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector) / GTX 970 (Temp)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5 / 2x Samsung 850 Pro 512gb
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz) / HP Omen 1080p 240hz
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case / Fractal Design Node 202
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick / Fractal Design 450 Watt Bronze
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 10 Pro / Windows Server 2016 / Windows 10 Pro
Great review, I cannot wait for the RTX 3090 review!

It's interesting how this cooler performs; it seems to be a meaningful improvement over past FE coolers, especially given the power consumption of these cards. I'm a little disappointed in the overclocking, though. Overclocking has been meh for a while, but this card seems to have even less headroom than usual. Granted, the memory clocks moved up and show some decent performance gains, but these cards look like they're already pushed to their limit out of the box, with only minor improvement from aftermarket cards.

Still cant wait!
 

ppn

Joined
Aug 18, 2015
Messages
910 (0.48/day)
The important increase is in transistors: 28,000 million (of which ~5,000 million are broken/disabled) is 70% more than the 2080 and 25% more than what the 2080 Ti had, and 25% is the same as the performance increase. On 7 nm DUV this would be a 426 mm² die instead of 628 mm²; the main reason to hold off is that shrinks are imminent at this point, and this one we didn't get, sadly. On 6 nm EUV this would be 360 mm² with +50% clock speed at the same power. So this is just another Titan: big and powerful, but it will fall inevitably, sometimes in less than 10-12 months. So yeah, $700 is not as good as you think, except in the moment; and in the moment, it's everything.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,367 (0.32/day)
System Name Custom AMD Rig
Processor AMD Ryzen™ 7 3800X
Motherboard ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling EVGA CLC 280mm AIO Liquid Cooler
Memory G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA GAMING 10GB
Storage 250GB Samsung 970 EVO NVMe, 2TB Inland Premium NVMe, 1TB Crucial MX500 SATA, 4TB WD Blue SATA
Display(s) Acer Nitro XV340CK Pbmiipphzx 34" UWQHD 1440p, LG 27GL650F-B UltraGear 27" 1080p 144 Hz 1ms
Case NZXT H510i Matte White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Kingston HyperX Cloud Flight S
Power Supply Corsair RMx Series RM750x 750W
Mouse Kingston HyperX Pulsefire Dart
Keyboard Kingston HyperX Alloy Origins
Software Windows 10 Pro 64-bit 2004
Nitpick: TSMC 16FF (12N) -> Samsung 10LPU (8N). 320 W+ is a 30% increase in power for a ~30% increase in performance.



It means little when it's dark silicon. It's largely useful for compute/CUDA workloads and at 4K, where each frame becomes more ALU-limited. The increased number and revised partitioning of the ROPs also help at 4K.

We may be doing GA a disservice, given that RT and denoising should show gains once games and the compiler are better optimised.



With a nice in-game overlay? :)
Hmm, I know the TDP of the 2080 Ti is 250 W, but you can see it using around 273 W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320 W, but it seems to hover around 303 W. Maybe NVIDIA is just overshooting their stated specs?
 
Joined
Mar 31, 2012
Messages
781 (0.25/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X
Motherboard QUANTA | ASUS Crosshair VII Hero
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3200 3400(OC) 14-14-14-34 @1.38v
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo
Display(s) 15,5" / 27"
Case Black & Grey | Phanteks P400S
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint KDE |UBUNTU | Windows 10 PRO
Benchmark Scores i dont care about scores
I hate to say it, but it's kinda weird with all those charts. Hype all the way. LOL
Unimpressive performance. Feels like they released this card in a hurry. :oops:

Nice review, thanks.
 
Joined
Apr 10, 2020
Messages
113 (0.58/day)
Hardware unboxed numbers (similar results):

RTX 3080 vs 2080 Ti: 14 Game Average at 1440p = +21 %
RTX 3080 vs 2080 Ti: 14 Game Average at 4K = +31 %

RTX 3080 vs RTX 2080: 14 Game Average at 1440p = +47%
RTX 3080 vs RTX 2080: 14 Game Average at 4K = +68%

Very nice gains at 4K and an average generational gain at 1440p (excluding Turing)...


Total (at the wall) system power consumption: 523 W
GPU only, measured on Nvidia's PCAT (Power Capture Analysis Tool) while playing DOOM: 327 W
An 8% performance-per-watt gain over Turing is far from impressive given that Ampere moved to a new node.
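Those figures can be combined into a quick perf-per-watt estimate. A sketch using the ~327 W PCAT number above and the ~273 W average-gaming draw of the 2080 Ti quoted elsewhere in this thread; the result depends entirely on which power figures you plug in:

```python
# Rough generational perf/W gain at 4K from the figures quoted above.
perf_3080, watts_3080 = 1.31, 327      # +31% over 2080 Ti at 4K, ~327 W (PCAT)
perf_2080ti, watts_2080ti = 1.00, 273  # baseline, ~273 W average gaming

ppw_gain = (perf_3080 / watts_3080) / (perf_2080ti / watts_2080ti) - 1
print(f"{ppw_gain:.0%} perf/W gain")  # 9% perf/W gain, close to the 8% above
```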
 
Joined
Aug 2, 2012
Messages
682 (0.23/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7 4770K
Motherboard Gigabyte Z87X-UD5H
Cooling Noctua NH-D15S
Memory Crucial Ballistix Tactical LP 16GB
Video Card(s) MSI GTX 1070 AERO OC (Custom Cooler)
Storage Crucial BX500 1TB, 2x Western Digital 2TB 2,5"
Display(s) EIZO CX240
Case Antec P280
Audio Device(s) Creative SoundBlaster ZxR
Power Supply Seasonic Platinum 760
Mouse Logitech G500s
Keyboard Logitech G710+
Software Windows 10 Pro 64-Bit
The fan noise levels are enough for me to pass on an FE.

Looks like I will have to find a partner board that fits my Arctic Accelero Xtreme 3.
 
Joined
Dec 31, 2009
Messages
18,683 (4.73/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Here you go. The 1080 Ti came out at $700 and performed almost twice as fast as a 980 Ti that was also $700
Math, my man. :(

That is 46% faster. You realize that 2x = 100%, right? For example, if card A ran at 100 FPS and card B ran at 146 FPS, card B is 46% faster than card A. If it were "double," it would be 100%.
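The percentage math is easy to trip over, so a tiny helper makes the distinction concrete (the FPS numbers are the hypothetical ones from the example above):

```python
def percent_faster(fps_new, fps_old):
    """How much faster the new card is, as a percentage of the old one."""
    return (fps_new / fps_old - 1) * 100

# Card A at 100 FPS, card B at 146 FPS: B is 46% faster, not "almost 2x".
print(round(percent_faster(146, 100)))  # 46
# "Twice as fast" requires +100%:
print(round(percent_faster(200, 100)))  # 100
```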
 
Joined
Oct 17, 2012
Messages
9,247 (3.16/day)
Location
Massachusetts
System Name Americas cure is the death of Social Justice & Political Correctness
Processor i5 8600k
Motherboard Asrock Z370 Extreme 4
Cooling Corsair H-110i GTX
Memory 2x 4Gb Crucial Sport LT
Video Card(s) MSI GTX 980 Gaming
Storage Samsung 850 evo 250Gb
Display(s) Dell Ultra Sharp Widescreen 24" 1200P
Case Fractal Design Meshify-C
Power Supply Seasonic Focus+ 750 Gold
Mouse Logitech G502 spectrum
Keyboard AZIO MGK-1 RGB (Kaith Blue)
Software Win 10 Professional 64 bit
It didn't seem to have that damn adhesive you need to heat up to access the fasteners, like the 1xxx reference cards did, at least. Unless I missed the picture with that.

The 9xx reference cards had those damn plastic-type hex screws that stripped if you coughed near them.
 

Raevenlord

News Editor
Staff member
Joined
Aug 12, 2016
Messages
3,240 (2.11/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 7 1700
Motherboard MSI X370 Gaming Pro Carbon
Cooling Arctic Cooling Liquid Freezer 120
Memory 16 GB G.Skill Trident Z F4-3200 (2x 8 GB)
Video Card(s) TPU's Awesome MSI GTX 1070 Gaming X
Storage Boot: Crucial MX100 128GB; Gaming: Crucial MX 300 525GB; Storage: Samsung 1TB HDD, Toshiba 2TB HDD
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case NOX Hummer MC Black
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
At 60Hz, yes.

My 1440p G-Sync monitor goes up to 165 Hz; I wouldn't say it's overkill for that, it depends on the game.

True. Mine goes up to 144 Hz, so looking at the figures, it may actually make sense for it, especially with future-proofing concerns. I suppose it depends mostly on Cyberpunk 2077's performance, though. Luckily, I have time until we get some information from the competition.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,367 (0.32/day)
System Name Custom AMD Rig
Processor AMD Ryzen™ 7 3800X
Motherboard ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling EVGA CLC 280mm AIO Liquid Cooler
Memory G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA GAMING 10GB
Storage 250GB Samsung 970 EVO NVMe, 2TB Inland Premium NVMe, 1TB Crucial MX500 SATA, 4TB WD Blue SATA
Display(s) Acer Nitro XV340CK Pbmiipphzx 34" UWQHD 1440p, LG 27GL650F-B UltraGear 27" 1080p 144 Hz 1ms
Case NZXT H510i Matte White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Kingston HyperX Cloud Flight S
Power Supply Corsair RMx Series RM750x 750W
Mouse Kingston HyperX Pulsefire Dart
Keyboard Kingston HyperX Alloy Origins
Software Windows 10 Pro 64-bit 2004
True. Mine goes up to 144 Hz, so looking at the figures, it may actually make sense for it, especially with future-proofing concerns. I suppose it depends mostly on Cyberpunk 2077's performance, though. Luckily, I have time until we get some information from the competition.
I'm gonna throw my guess out there: the RTX 3080 will probably manage around 100 FPS in Cyberpunk 2077 at 4K. This is based on expected improvements over the Witcher 3 engine and the fact that they are still optimizing it for the current-gen (PS4/XB1) consoles.
 
Joined
Nov 11, 2016
Messages
544 (0.38/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
Hmm, I know the TDP of the 2080 Ti is 250 W, but you can see it using around 273 W (average gaming) according to @W1zzard's charts. The 3080 is rated at 320 W, but it seems to hover around 303 W. Maybe NVIDIA is just overshooting their stated specs?
It's simple to explain:
The 2080 Ti FE's 260 W TDP puts the chip in the lower, more efficient region of the perf/power curve, while the 3080's 320 W TGP sits at a higher point on that curve.
That means it's easy to overclock the 2080 Ti by simply raising the power limit, while raising the power limit on the 3080 does almost nothing (as every review has pointed out; very similar to the 5700 XT).
It also means that lowering the TGP of the 3080 to a level similar to the 2080 Ti's, like ComputerBase did, will not lower the 3080's performance by much, improving its efficiency if you so require.

 