
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
1.7 GHz is around the boost clock for the RTX 2000 series, is it not?
Edit: it says 1710 MHz boost for the RTX 3080.
Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration to 1.9 GHz, so why would Nvidia only manage 1.7? I would believe this only if it were on Samsung 8nm and that node really isn't good...
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration to 1.9 GHz, so why would Nvidia only manage 1.7? I would believe this only if it were on Samsung 8nm and that node really isn't good...
Well, it says 1650 MHz boost for my RTX 2060 Super, but it still boosts to 1850 MHz out of the box, and to 2000 MHz+ with a slight OC.
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration to 1.9 GHz, so why would Nvidia only manage 1.7? I would believe this only if it were on Samsung 8nm and that node really isn't good...
It is on Samsung's 8nm (comparable to TSMC's 10nm); all leakers pointed in that direction (Samsung's "5nm" in the best-case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence the shitty clock speeds.

Well, it says 1650 MHz boost for my RTX 2060 Super, but it still boosts to 1850 MHz out of the box, and to 2000 MHz+ with a slight OC.

A GPU with 5248 shading units is a different beast than a 1920-SU GPU... more SUs, lower clock speeds.
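To put the "more SUs, lower clocks" trade-off in rough numbers, here is a minimal Python sketch using the textbook dynamic-power relation P ∝ N·f·V²; the voltages and the scaling constant are illustrative assumptions, not leaked figures.

```python
# Rough illustration of why a wider GPU tends to clock lower at a fixed power budget.
# Dynamic power scales roughly as P ~ k * N_units * f * V^2; k and the voltages here
# are made-up illustrative values, not real silicon data.

def dynamic_power(n_units, freq_ghz, voltage, k=1.0):
    """Relative dynamic power in arbitrary units."""
    return k * n_units * freq_ghz * voltage ** 2

narrow = dynamic_power(1920, 1.9, 1.00)  # small GPU pushed to high clocks
wide = dynamic_power(5248, 1.7, 0.90)    # big GPU at lower clocks and voltage

print(f"relative power, 1920 SU @ 1.9 GHz: {narrow:,.0f}")
print(f"relative power, 5248 SU @ 1.7 GHz: {wide:,.0f}")
print(f"the wide GPU still draws ~{wide / narrow:.1f}x the narrow one")
# ~2.7x the shader count overwhelms the clock/voltage reduction, which is why
# per-unit clocks have to come down as unit counts grow.
```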
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
That 1.7 GHz boost is not an indicator of the actual boost clocks under gaming loads.
The 2080 Ti, for example, is rated for 1545 MHz, and you'll never see one under 1750 MHz in actual games unless something is broken with the cooling.
 
Joined
May 8, 2018
Messages
1,495 (0.69/day)
Location
London, UK
"HDMI 2.1 and DisplayPort 1.4a. "

Finally. Now all monitors that require huge bandwidth will come with HDMI 2.1.
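For a rough sense of why high-refresh 4K monitors need HDMI 2.1, here is a minimal Python sketch of the raw pixel data rate for 4K at 120 Hz with 10-bit RGB; it ignores blanking intervals and link-encoding overhead, which push the real requirement even higher, and the 18/48 Gbps figures are the nominal HDMI 2.0/2.1 link rates.

```python
# Raw (uncompressed) video data rate for 4K @ 120 Hz with 10-bit RGB.
# Blanking intervals and link encoding overhead are ignored, so the real
# link requirement is higher than this raw figure.

width, height = 3840, 2160
refresh_hz = 120
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"raw pixel rate: {raw_gbps:.1f} Gbit/s")        # ~29.9 Gbit/s

print(f"fits HDMI 2.0 (18 Gbps)? {raw_gbps < 18}")     # False
print(f"fits HDMI 2.1 (48 Gbps)? {raw_gbps < 48}")     # True
```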
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
It is on Samsung's 8nm (comparable to TSMC's 10nm); all leakers pointed in that direction (Samsung's "5nm" in the best-case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence the shitty clock speeds.
If I remember correctly, Nvidia said that AMD had reserved most of the 7nm capacity over at TSMC. But they went ahead and beat AMD to reserving 5nm for next year. So wait for the RTX 3000 Super series on TSMC 5nm.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Only 10GB on the 3080? Seriously? And 8GB on the 3070 is the same we've had for the x70 ever since Pascal. That's lame.
Do you really “need” more than 10GB VRAM?
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
If I remember correctly, Nvidia said that AMD had reserved most of the 7nm capacity over at TSMC. But they went ahead and beat AMD to reserving 5nm for next year. So wait for the RTX 3000 Super series on TSMC 5nm.
Not a chance. Apple and AMD are TSMC's prime partners. Nvidia can get a piece of 5nm production, but not at the scale it needs for PC gaming GPUs. I can see 4xxx HPC parts and high-end Quadros being on TSMC's 5nm, but nothing else.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I'm talking about Nvidia's approach (RTX and dedicated hardware), not ray tracing as a technology. Consoles use AMD hardware, and it will work in a different way.
The point, however, is that RT tech is here... it isn't a gimmick, with everyone all in. Capeesh? :)
What do you mean? My PC room warms up to 26C (and above when outside temperatures hit +35C for a few days) during the summer months, and I live in a well-insulated house in a moderate climate. Add a 500 W PC to the room and it easily gets above 28C. I underclock my 1080 Ti during the summer months to get it to consume around 160 W while gaming, but I can't see how I could underclock a 320 W GPU to get similar results.
Have a cookie... just saying 300 W is nothing compared to some cards in the past. :)
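On the quoted point about reining in a 320 W card: capping the board power limit is usually the simplest lever, and it achieves much the same thing as the underclocking described above. Here is a minimal sketch of doing that through nvidia-smi from Python; the 200 W target and GPU index 0 are example values only, it needs admin rights, and the target must fall within the range the driver reports as enforceable.

```python
# Query and cap an NVIDIA card's board power limit through nvidia-smi.
# Requires admin/root, and the target wattage must sit inside the range the
# driver reports as the min/max enforceable power limit.
import subprocess

# Show current, default and min/max enforceable power limits.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Cap GPU 0 to 200 W (example value only).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "200"], check=True)
```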
 
Joined
Sep 17, 2014
Messages
20,917 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
They're having a laugh. I might hard-pass on this for another gen. I'm still left wondering how on earth this is all worth it just for a handful of pretty weak RT titles...
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
It is on Samsung's 8nm (comparable to TSMC's 10nm); all leakers pointed in that direction (Samsung's "5nm" in the best-case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence the shitty clock speeds.
Well, it seems even the guys from VideoCardz aren't sure it's 7nm, so it does look quite surprising. However, RDNA1 was on plain 7nm, not the P or EUV variants. But maybe Nvidia's boost ratings are conservative, as others are pointing out; we'll have to see.
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
They're having a laugh. I might hard-pass on this for another gen. I'm still left wondering how on earth this is all worth it just for a handful of pretty weak RT titles...
As in Cyberpunk 2077? Yeah, right, that's a weak title. Minecraft? Yeah, also a weak title. Anything made on UE for the foreseeable future? Also weak and unimportant. Console ports made for DXR? Also unimportant... ;) Riiiiight.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Not a chance. Apple and AMD are TSMC's prime partners. Nvidia can get a piece of 5nm production, but not at the scale it needs for PC gaming GPUs. I can see 4xxx HPC parts and high-end Quadros being on TSMC's 5nm, but nothing else.
Indeed, and also TSMC seems to prefer splitting its capacity among its clients rather than allowing one client to book the entire capacity for a given node.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
If I remember correctly, Nvidia said that AMD had reserved most of the 7nm capacity over at TSMC. But they went ahead and beat AMD to reserving 5nm for next year. So wait for the RTX 3000 Super series on TSMC 5nm.
You realise AMD and TSMC announced a partnership on 5nm before that rumour came out.
As for TSMC 5nm Supers, that's dreamy IMHO.

Soooo, 300 watts max, some said yesterday; you can't exceed two 8-pins at 300, they said. Balls, I said, you wouldn't need heavier-gauge wire.

350 watts at base clocks; what's the max OC pull on that, 500? We'll see.

I own a Vega 64, of course you can, and I'll take this opportunity to welcome Nvidia back into Foreman grill territory; it's been lonely here for two years :p
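For context on the "300 watts over two 8-pins" exchange, here is a quick Python sketch of the nominal PCIe power budget; the per-connector ratings are the spec's nominal figures, and the 350 W number is the leaked TDP from the article.

```python
# Nominal power budget for a card fed by the PCIe slot plus two 8-pin connectors.
# These are the spec's nominal connector ratings; real cables carry headroom.

slot_w = 75        # PCIe x16 slot
eight_pin_w = 150  # per 8-pin PEG connector

budget_w = slot_w + 2 * eight_pin_w
print(f"nominal budget, slot + 2x 8-pin: {budget_w} W")  # 375 W

leaked_tdp_w = 350
print(f"350 W TDP within the nominal budget? {leaked_tdp_w <= budget_w}")  # True
```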
 
Joined
May 25, 2019
Messages
12 (0.01/day)
So let's see

RTX 3090 5248 CUDA, 1695 MHz boost, 24 GB, 936 GB/s, 350 W, 7 nm process

RTX 2080 Ti 4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process

I guess all the potential fab process power savings were erased by the extra RAM, RAM speed, CUDA, RT and tensor cores.

Edit: Maybe comparing to the RTX 3080 is more informative:

RTX 3080 4352 CUDA, 1710 MHz boost, 10 GB, 760 GB/s, 320 W, 7 nm process

RTX 2080 Ti 4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process

Almost no difference between these cards except on the RT and Tensor side. If the price is much lower than $1000 for the 3080 then you can get 2080 Ti performance on the 'cheap'.

There are BIG differences between the 3080 and the 2080 Ti.
1. Clock speed. 1710 vs 1545 MHz is a >10% higher clock. That right there explains most of the power difference, as well as the efficiency differences.
2. RT core changes. You could call this IPC, if my understanding is right; supposedly the RT cores are hugely more efficient.
3. Heat. I am only just now experiencing how much heat a video card pushes into the room. Let's put it this way: you can buy a ~$300 air conditioning unit that exhausts to the outside, and those remove roughly 8,000-12,000 BTU of heat per hour. In my experience, one can cool my bedroom from 80F to 70F in about 10 minutes; your mileage will vary. A simple conversion shows 320 watts is roughly 1,092 BTU/h. If my uneducated, guesswork math is close to right, that's roughly 10 degrees' worth of heating per hour that needs to be cooled or vented out of the room, on top of what your computer puts out without the graphics card. I know my room gets HOT if I run games for a few hours solid; it's going to be roughly 30% more heat from the video card, and I'm still running a 10-series card.
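A quick check of the watts-to-BTU arithmetic above, as a minimal Python sketch; the 3.412 BTU/h-per-watt factor is the standard conversion, and the AC capacity is the poster's ballpark figure.

```python
# Convert a GPU's power draw into the heat it dumps into the room, in BTU/h.
# Essentially all electrical power a card draws ends up as heat:
# 1 W ~= 3.412 BTU/h.

BTU_PER_HOUR_PER_WATT = 3.412

gpu_w = 320
gpu_btu_h = gpu_w * BTU_PER_HOUR_PER_WATT
print(f"{gpu_w} W GPU ~= {gpu_btu_h:,.0f} BTU/h")  # ~1,092 BTU/h

# Compare against a small portable AC rated at around 8,000 BTU/h.
ac_btu_h = 8000
print(f"that is ~{gpu_btu_h / ac_btu_h:.0%} of the AC's rated capacity")
```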
 
Joined
Apr 19, 2017
Messages
71 (0.03/day)
Joined
Jan 21, 2020
Messages
109 (0.07/day)
Is this thread about NV GPUs soon to be released or AMD nodes?
Exactly. From the attempts to downplay ray tracing to the AMD fans talking about future nodes, it doesn't seem like the red team has much confidence in RDNA2 / Big Navi. I, for one, am really curious just how much Nvidia is going to push ray tracing forward. The rumours about a 4x increase could be true, given the TDP and the classic CUDA core counts from this leak. Maybe even more than that? Can't wait to see.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Well, to be a bit more on topic: I saw the poll, and apparently 5% of users are willing to wait 5 years or more for Intel to come up with competitive high-end desktop graphics :p ...
 
Joined
Nov 6, 2016
Messages
1,575 (0.58/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Wow, these ARE power hungry. Now I'm really starting to believe the leaks that Nvidia was forced to do this because RDNA2 is that competitive, and that the second-biggest Navi will be the price-to-performance-to-power-usage star of the new generation of cards.

Now I'm really excited for RDNA2... but I did just get a 5700 XT last November, so maybe I'll actually just check out the Xbox Series X, in all honesty. I've never been excited about consoles, but it's definitely different this time around.
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
Idk, should I upgrade from my HD 4850 512MB GDDR3? Not much performance difference, it seems, minus the lack of ray tracing. :roll:
 
Joined
Feb 11, 2009
Messages
5,397 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Do you really “need” more than 10GB VRAM?

Bit of a hard question to answer; do you "need" anything in this space? Do you even "need" a dedicated GPU?

This is about high-end gaming, and more VRAM to work with is better: higher-resolution textures, better shadow quality, and other stuff.
10GB on a new flagship 3080 is... just extremely lackluster.

Like I said, it's like Big N is following Big I and just selling us a nothingburger due to the lack of competition in these price brackets.
 