
NVIDIA GeForce RTX 3080 Founders Edition

Joined
Apr 21, 2010
Messages
427 (0.10/day)
System Name Home PC
Processor Ryzen 1600X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory Dual channel G skill F4-3200C16-8GVKB
Video Card(s) XFX RX480 GTR - XFX Double Dissipation R9 290
Storage Samsung SSD Evo 120 - Adata SU80 256 - Adata SX6000 Lite 512 GB
Display(s) AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Trash A4tech OP-620D
Keyboard Old 12 years FOCUS FK-8100
If you go 1080p, you can't go back to 720p.
If you go 1440p, you can't go back to 1080p.
If you go 2160p, you can't go back to 1440p.

That being said, your eyes get excited by the new world but hurt in front of the old one. So for a next-generation card, make sure you have enough budget: you can't keep up with the current generation if you're in the high tier. 4K is no joke.
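A quick pixel-count comparison shows why each step up is so hard to walk back from (just a sketch of the raw arithmetic, nothing card-specific):

```python
# Raw pixel counts per frame: 2160p is four times the work of 1080p,
# which is why "4K is no joke" for a GPU budget.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the load of 1080p)")
```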
 
Joined
Jan 17, 2018
Messages
119 (0.09/day)
Processor Ryzen 7 1700 @ 3.7ghz
Motherboard Asrock Taichi x370
Cooling Stock
Memory 32gb Trident Z @ 2933
Video Card(s) Sapphire Nitro+ Vega 64
Storage 256 Sandisk Pro SSD, 1TB Mushkin Reactor SSD, 5TB WD Black HDD
Display(s) Asus MG279Q
Case Phanteks Ethoo Evolv ATX
Power Supply Seasonic SS-760XP2
The RTG hype train has been excellent at overpromising right up until the reviews land, and then it derails hard. And the "let's wait" game never fails to disappoint. Too bad most people still prefer NVIDIA's overpriced hot garbage with gimmicky tech: according to the Steam hardware survey, the overhyped RX 5000 series that God-Empress Lisa Su bestowed fails to register in the charts, standing at 1.13%, below a single card from the GeForce RTX 20 series, the RTX 2060 SUPER at 1.25%. It's amazing and extremely puzzling how and why the comment sections of major tech websites are flooded with AMD fans. Brand loyalty is effing stupid.

And "leaks" like this one are just disgusting: https://videocardz.com/newz/amd-radeon-rx-6900xt-leaks-in-new-photos If AMD were confident in Big Navi, they wouldn't have tried to steal the thunder from this NVIDIA release.
We get it, you don't like AMD and Radeon. You've replied and posted in this thread multiple times ranting about the AMD fanboys, who barely exist in this thread.

AMD's GPUs are a month and a half off; it's not unreasonable for people to wait and see what performance Big Navi and RDNA2 bring to the table before spending $700+ on a GPU.
 
Joined
Mar 10, 2010
Messages
9,230 (2.19/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R7 3800X@4.350/525/ Intel 8750H
Motherboard Crosshair hero7 @bios 2703/?
Cooling 360EK extreme rad+ 360$EK slim all push, cpu Monoblock Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 32Gb in four sticks./16Gb
Video Card(s) Sapphire refference Rx vega 64 EK waterblocked/Rtx 2060
Storage Silicon power qlc nvmex3 in raid 0/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd
Display(s) Samsung UAE28"850R 4k freesync.
Case Lianli p0-11 dynamic
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Joined
Jul 27, 2020
Messages
39 (0.09/day)
I have no brand preference. Right now I have a 2700X with a 5700 XT and a 1440p 144 Hz FreeSync monitor that does around 60 to 70 fps on Badass settings in Borderlands 3, and it does what I need it to do. But part of me says "get this," and I'm trying to drown that out, because I have what I need but I want it!
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
22,848 (3.60/day)
Processor Core i7-8700K
Memory 32 GB
Video Card(s) RTX 3080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I'll take one exception to the review conclusion, "Makes little sense for gamers without a 4K monitor." To date, I find I more often enjoy playing at 1440p with ULMB than at 4K with sync, and I won't be investing in a 4K monitor until I see one do 120 Hz with ULMB. The 3080 managed to hit 120 fps in just under half the games tested, and that's enough to justify the purchase.
Fair point, I updated the entry to "Makes little sense for gamers without a 4K monitor, or 1440p 144+ Hz"
 
Joined
Jan 24, 2008
Messages
882 (0.18/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
Thank you so much for testing with the VII and for testing BFV. All the other reviewers have stopped testing the Frostbite engine, and for Battlefield players that benchmark is very important.
But since the 3080 is only 27% faster than my VII in BFV, I'm going to wait for the Big Navi benchmarks and see if it still performs much better than its NVIDIA counterparts.
For some reason, Frostbite loves AMD cards.
 

md2003

New Member
Joined
Jul 31, 2020
Messages
22 (0.05/day)
Thank you so much for testing with the VII and for testing BFV. All the other reviewers have stopped testing the Frostbite engine, and for Battlefield players that benchmark is very important.
But since the 3080 is only 27% faster than my VII in BFV, I'm going to wait for the Big Navi benchmarks and see if it still performs much better than its NVIDIA counterparts.
For some reason, Frostbite loves AMD cards.
Do you mean 27% at 1080p? If I'm not totally wrong here, BFV is capped at 200 FPS.
 
Joined
Sep 26, 2009
Messages
78 (0.02/day)
Location
Puerto Rico
Thank you @W1zzard for an excellent review, job well done!

Boss, I've got a couple of questions. In theory the overclocking headroom is limited by the power limits, temps, and the low voltage of 1.081 V. Do you think voltage mods like the shunt mods I've done on many cards will help with overclocking headroom? I'm planning to run this card on a water chiller at -21 °C for HWBOT competitive benchmarking.
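Since shunt mods came up, here is a rough illustration of why they raise the effective power limit (the resistor and current values below are hypothetical, not measured from any 3080 PCB):

```python
# The card estimates current from the voltage drop across a tiny sense
# resistor (I = V_drop / R_shunt). Stacking a second shunt in parallel
# lowers the effective resistance, so the controller under-reads current
# and allows more real power before hitting its limit.
def parallel(r1, r2):
    return (r1 * r2) / (r1 + r2)

r_stock = 0.005                        # ohms, hypothetical stock shunt
r_modded = parallel(r_stock, 0.005)    # identical shunt stacked on top

real_current = 60.0                    # amps actually flowing (hypothetical)
v_drop = real_current * r_modded
reported_current = v_drop / r_stock    # controller still assumes r_stock

print(f"real: {real_current:.0f} A, reported: {reported_current:.0f} A")
# With two equal shunts in parallel, the card reads half the real current,
# effectively doubling the power limit.
```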
 
Joined
Sep 1, 2009
Messages
977 (0.22/day)
Location
CO
System Name Rebirth
Processor Intel i5 2500k @4.5Ghz
Motherboard Asus P8P67 Pro
Cooling Megahalem 120x25 x2 GT AP-15 Push/Pull
Memory 2x4Gb Corsair Vengeance
Video Card(s) Sapphire HD7950 Vapor-X + MSI HD7950 TF3
Storage Samsung 840 Pro 120 SSD + Seagate 7200.12 1TB + 500gig WD + 3TB Hitachi
Display(s) X-Star Glossy DP2710
Case Antec 1200
Audio Device(s) Asus Xonar STX
Power Supply Antec CP-850
Software Microsoft Windows 8 Pro x64
I think I am going to wait for RDNA2 before I make a decision.
 
Joined
Jun 26, 2018
Messages
20 (0.02/day)
Location
USA
System Name Cougar Conquer Custom
Processor Intel i9-9900k@5 GHz
Motherboard Gigabyte Aorus Master Z390
Cooling EKWB Custom Hardline
Memory 32 GB Corsair Vengeance
Video Card(s) ASUS Strix 3080 (soon)
Storage 1 TB NVME, 500 GB NVME
Display(s) BenQ Zowie XL2746S
Case Cougar Conquer
Power Supply Corsair AX1200i
Mouse Logitech GPW
Keyboard Fnatic Ministreak
Fair point, I updated the entry to "Makes little sense for gamers without a 4K monitor, or 1440p 144+ Hz"

Any chance you will test 1080p/240 Hz with popular BR games (e.g. Warzone)? I know reproducing a test run can be hard in a BR, but there are 75 million of us out there, which is a fair bit more than the SP games tested on this site and others.

If you go 1080p, you can't go back to 720p.
If you go 1440p, you can't go back to 1080p.
If you go 2160p, you can't go back to 1440p.

That being said, your eyes get excited by the new world but hurt in front of the old one. So for a next-generation card, make sure you have enough budget: you can't keep up with the current generation if you're in the high tier. 4K is no joke.

Not true. I went back to 1080p/240 Hz from 1440p/144 Hz because the image-quality improvement wasn't worth the performance loss.
 
Joined
Oct 8, 2005
Messages
234 (0.04/day)
Location
Amsterdam
System Name Gamer
Processor Intel i9 9900K@4.9Ghz Offset voltage
Motherboard Asus Rog Strix Z390-F
Cooling Noctua NHD15S
Memory Corsair Vengeance LPX 2x8GB 3200Mhz C16
Video Card(s) Asus TUF RTX3080 oc
Storage 3x Seagate Barracuda 2TB, 512GB SSD Samsung 860evo/1TB nvme Samsung 970evo
Display(s) Asus MG278Q
Case CM Mastercase Pro 5
Audio Device(s) Onboard
Power Supply Coolermaster V850 - 850Watt
Mouse Logitech Hyperion G402
Keyboard Corsair K55
Software Windows10 Pro
Benchmark Scores Firestrike Ultra: 11444 - Timespy: 18246
You get used to a resolution quite easily, even going back, although I wouldn't want to go back from 1440p at this point. I tried a 4K 28" monitor for a while, but frame rates were too low, and I had to enlarge fonts on the Windows desktop or they were simply too small for my taste. A larger 4K screen would mitigate that, though.
Graphically it looked very crisp, even the desktop. But my current 1440p monitor will hopefully serve me for many years; 144 Hz is pretty smooth if you can also hit the fps.
 

EzaaaH

New Member
Joined
Sep 16, 2020
Messages
1 (0.00/day)
@W1zzard

In the overclocking section, you briefly mention that undervolting is not possible. Can you elaborate on this very important point? Is it prevented on Ampere at the hardware level, or is it just that the necessary tools are not yet available? The inability to undervolt these extremely power-hungry chips would be a serious shortcoming, and I can't believe NVIDIA would drop it for this exact generation after allowing access to the whole voltage/frequency curve for so long on previous generations.

Other than that, an excellent article as usual!
It seems the primary issue is the pre-release drivers and the lack of support in software like Afterburner. Other sites have indicated that even the OC Scanner tool doesn't work yet.

This will most likely change going forward. There's no reason to believe NVIDIA would suddenly decide to lock out these features; it would earn them unnecessary bad PR and cost them credibility with enthusiasts.

I don't like how the article phrased it, though. I could be wrong, but I'd be very surprised if undervolting were really gone.
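For readers unfamiliar with how undervolting is usually exposed in tools like Afterburner, here is a rough sketch of the voltage/frequency-curve clamp being discussed (the curve values are made up for illustration, not real Ampere figures):

```python
# Undervolting via the V/F curve: every point at or above a chosen voltage
# is clamped to a single frequency, so under load the GPU never requests
# more than that voltage while still holding a decent clock.
def undervolt(curve, max_mv, target_mhz):
    # curve: list of (millivolts, MHz) points, ascending by voltage
    return [(mv, mhz) if mv < max_mv else (max_mv, target_mhz)
            for mv, mhz in curve]

# Hypothetical stock curve, ending at the 1.081 V ceiling mentioned above.
stock = [(800, 1700), (900, 1850), (1000, 1950), (1081, 2000)]
print(undervolt(stock, 900, 1900))
```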
 
Joined
Jun 10, 2014
Messages
2,497 (0.94/day)
I don't think anyone is wondering whether or not a game is going to need more than 10 GB any time soon, because they won't. Running them at 4K with everything maxed out will most definitely cause >10 GB usage quite soon.
That's a strange way to think about it: if it is possible to exceed that amount, then it inevitably becomes a limitation.
Proper benchmarks are the right way to test whether games actually need more than a certain amount of VRAM.
Unless we are talking about custom mods that may cause abnormal VRAM usage, it's very unlikely that 10 GB will be a bottleneck for this card any time soon. NVIDIA have a very good idea of the big titles coming in the next year or so; if this were going to be a problem, they wouldn't have released the card with 10 GB.

The reality, though, is that the way games commonly use VRAM means more bandwidth and more compute are needed to actually use more VRAM in a single frame. So by the time games need >10 GB for top titles at 4K, this card won't be fast enough anyway. Many have made bold predictions about this kind of thing for years, but unless games start working in a fundamentally different way, I'd expect other bottlenecks to appear before capacity problems (not counting the odd edge case, of course).
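Some back-of-the-envelope framebuffer math helps put the VRAM discussion in perspective. The render-target layout below is hypothetical (a generic deferred-rendering G-buffer, not any specific game's), but it shows that resolution-scaled buffers are a small slice of a 10 GB budget; textures and geometry dominate:

```python
# Estimate the memory used by resolution-dependent render targets at 4K.
def target_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

w, h = 3840, 2160
# Hypothetical G-buffer: four RGBA16F targets (8 B/px) plus a 32-bit depth buffer.
gbuffer = 4 * target_bytes(w, h, 8) + target_bytes(w, h, 4)
print(f"G-buffer at 4K: {gbuffer / 2**20:.0f} MiB")
# Even a fat G-buffer lands in the hundreds of MiB; the remaining gigabytes
# go to textures, meshes, and streaming pools, which scale with asset
# quality rather than resolution.
```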

You would be surprised (or horrified?) if you knew how many business critical processes are implemented using "primitive" tools like Excel and VBA. But even though it might not be an elegant solution, it's often good enough and it fits internal procedures very well, so often it's actually much better than something some highly paid consultants would come up with. Many programmers make fun of this kind of stuff, but sometimes these improvised tools can be very low maintenance and work very well for many people.
 

ppn

Joined
Aug 18, 2015
Messages
1,005 (0.45/day)
On top of that, the more RTX becomes prevalent, the less memory buffer is needed, since who needs textures anymore when everything is done with rays, or something like that, so I believe.

So, in order to be 31% faster than the 2080 Ti with bandwidth that is only 23% better, the GPU must be 39% better, and that corresponds to 6144 CUDA cores operating as FP32 and 2560 as integer in games on average. The 3070 must be 4352 FP32 and 1536 integer, lagging behind the 2080 Ti by 16%; with only 448 GB/s of memory, the decision not to use GDDR6X (606 GB/s) is a great loss. I like the idea of a 3060 Ti ASAP.
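Curiously, the 39% figure above falls out of a crude model where overall performance scales as the geometric mean of compute and bandwidth throughput. That scaling rule is an assumption for illustration, not NVIDIA's math or the poster's stated method:

```python
# Assumed model: perf ~ sqrt(compute_uplift * bandwidth_uplift).
# Given the post's 31% overall uplift and 23% bandwidth uplift over the
# 2080 Ti, solve for the implied compute uplift.
perf_uplift = 1.31
bandwidth_uplift = 1.23

implied_compute = perf_uplift ** 2 / bandwidth_uplift
print(f"implied compute uplift: {(implied_compute - 1) * 100:.1f}%")
```

Under that assumption the implied compute uplift comes out near 39.5%, matching the post's estimate.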
 
Joined
Jun 10, 2014
Messages
2,497 (0.94/day)
On top of that, the more RTX becomes prevalent, the less memory buffer is needed, since who needs textures anymore when everything is done with rays, or something like that, so I believe.
Textures will not become less useful with raytracing, quite the contrary, as more textures will be needed to represent the various properties of materials.

But raytracing may reduce the need for many render passes, temporary framebuffers etc. used in various advanced lighting techniques.

So, in order to be 31% faster than the 2080 Ti with bandwidth that is only 23% better, the GPU must be 39% better, and that corresponds to 6144 CUDA cores operating as FP32 and 2560 as integer in games on average. The 3070 must be 4352 FP32 and 1536 integer, lagging behind the 2080 Ti by 16%; with only 448 GB/s of memory, the decision not to use GDDR6X (606 GB/s) is a great loss. I like the idea of a 3060 Ti ASAP.
Don't get too caught up in theoretical specs, good benchmarking is what actually matters.
Ampere is a major architectural overhaul, including caches etc. which means you can't extrapolate the performance based on older architectures.
 
Joined
Mar 10, 2015
Messages
3,984 (1.66/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Well, if the 3080 sells out, I'll try for a 3090. If that sells out, I'll wait for RDNA2, unless one of the former comes back in stock.
 
Joined
Feb 18, 2017
Messages
674 (0.40/day)
I'm just curious what the people who in the old days made fun of the 290X or 390X, or even the 480 or Vega cards, now say about power consumption. What do they say about the extra 90 W this card needs compared to the 2080, or about its efficiency increase over the 2080 versus the 1080's increase over the 980?

People should forget about this card's performance, power efficiency, and price/performance at 1080p and 1440p; this card is made for 4K, unless you want 500 fps at 1080p in your favourite online shooter. Period.

Overall I'm pleased with what NVIDIA has achieved this generation, but since I'm not a filthy-rich European/US citizen who can afford a high-refresh 4K HDR IPS monitor, I will simply forget about it.

To be honest, over 95% of gamers may simply skip this review, as the card is beyond their budget.

AMD fans who are now crying wolf about this card's characteristics at lower resolutions (below 4K) should be ignored with prejudice. They simply don't understand the point of this GPU.

Wut? There are already released games that won't hold a fixed 60 fps at 4K on the 3080. And what do you say about the extra 90 W of power consumption compared to the 2080? You must have doomed AMD for even the RX 480's consumption back in those days. :) And the next-gen games only arrive in 2021 and onward.
 
Joined
Feb 8, 2009
Messages
71 (0.02/day)
Location
Solihull, England, UK
Those power consumption figures are absolutely horrible. So much for VGA firms going for more efficient, power-saving architectures. It looks like the progress stopped with the 1070/1080 series on that front.
 
Joined
Jun 26, 2018
Messages
20 (0.02/day)
Location
USA
System Name Cougar Conquer Custom
Processor Intel i9-9900k@5 GHz
Motherboard Gigabyte Aorus Master Z390
Cooling EKWB Custom Hardline
Memory 32 GB Corsair Vengeance
Video Card(s) ASUS Strix 3080 (soon)
Storage 1 TB NVME, 500 GB NVME
Display(s) BenQ Zowie XL2746S
Case Cougar Conquer
Power Supply Corsair AX1200i
Mouse Logitech GPW
Keyboard Fnatic Ministreak
Those power consumption figures are absolutely horrible. So much for VGA firms going for more efficient, power-saving architectures. It looks like the progress stopped with the 1070/1080 series on that front.

I think it has to do with Samsung rather than them not trying to be more energy efficient.
 
Joined
Mar 20, 2019
Messages
99 (0.11/day)
Processor R7 5800X
Motherboard Asus Rog Strix B550 I
Cooling Fractal Celsius
Memory 32
Video Card(s) EVGA RTX 3080 Hybrid
Storage Lots
Display(s) LG 4k
Case Fractal Nano S
Audio Device(s) Vintage
Power Supply EVGA 650 SFF
Mouse EVGA X17
Keyboard EVGA Z15
If you're truly worried about the power consumption, play games on your phone.
 
Joined
Jul 7, 2019
Messages
141 (0.17/day)
Great review. Thanks.

A 3080 for 1080p seems like overkill, but the high-FPS performance difference versus Pascal is great, and coming from a Maxwell card this thing is supremely fast. For new titles that use more complex engines and 9th-gen assets, it would be superb for high-refresh 1080p with RT, which beats dropping to a standard 60 Hz refresh at 4K. 1440p seems the apt middle ground, but it looks like we need Ryzen 4000 and Intel Rocket Lake to truly push more FPS at 1080p in existing and older titles. I'll wait for those, and also for the probably inevitable 3080 SUPER / Ti refresh with 14/16 GB of G6X, which would make it a long-life purchase for 1080p/1440p; 4K with RT takes a massive hit to sub-60 FPS and then needs DLSS on top.


As for the design, I don't like that ugly 12-pin connector; it sticks out like a sore thumb, and the cabling will look bad no matter what you do. Other than that, the card's full-metal design trumps the AIB cards. It's a shame so many AIBs are making crappy-looking GPUs; EVGA of all companies, who make a solid Dark-series motherboard, made the ugliest design, and ASUS is too stylized compared to their 2080 Strix. I'll wait for those reviews too, and for tests of that odd flow-through fan that blows toward the VRM/CPU/memory. Too many flaws in the FE for that stunning look; not worth it.

It's also a shame that NVIDIA dropped NVLink from the 3080 this time, even though it's a GA102 part, especially given that two 3080s in SLI would reach 3090 pricing and destroy it in performance.


Much needed for PCMR.
 
Joined
May 11, 2009
Messages
1,064 (0.24/day)
Location
Colorado
System Name not sure yet
Processor R9 3950X
Motherboard Asrock X570 Taichi
Cooling It's a secret for now..
Memory 64GB Vengeance
Video Card(s) 2080Ti FE
Storage 2x 1TB NVME, 3x 500GB SSD, 2TB SSHD
Display(s) Predator 27" 4k 144hz, 55" Vizio 4k
Case Lian Li O11-D White
Audio Device(s) Steel Series Wireless
Power Supply EVGA 1200W
Mouse Glorious D-
Keyboard GMMK 60% Red Box Caps
Software Windows 8.1 Profession x64
@W1zzard Is there any plan to add the Call of Duty series back into reviews any time soon? Seems like it's getting a lot more attention on PC lately.
 