
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Joined
Apr 10, 2020
Messages
480 (0.33/day)
RTX 3080
10 GB of GDDR6X memory running on a 320-bit bus at 19 Gbps (memory bandwidth capacity of 760 GB/s)
4,352 CUDA cores running at 1710 MHz
320 W TGP (board power)
-------------------------------------------------------------
FP32 (float) performance = 14.88 TFLOPS ???

RTX 2080 Ti

11 GB of GDDR6 memory running on a 352-bit bus at 14 Gbps (memory bandwidth capacity of 616 GB/s)
4,352 CUDA cores running at 1545 MHz
250 W TGP (board power)
------------------------------------------------------------
FP32 (float) performance = 13.45 TFLOPS

This can't be right, only a 10% rasterization performance gain over the 2080 Ti? I REALLY hope Nvidia managed to get some IPC gain from the new node/arch, or it's gonna be Turing déjà vu all over again :(
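For what it's worth, both FP32 figures fall out of the usual peak-throughput estimate of CUDA cores × 2 FLOPs per clock (one FMA) × boost clock. A minimal sketch, assuming the leaked numbers:

```python
def fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    """Peak FP32 estimate: cores x 2 FLOPs per clock (one FMA) x boost clock."""
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

print(fp32_tflops(4352, 1710))  # leaked RTX 3080: ~14.88 TFLOPS
print(fp32_tflops(4352, 1545))  # RTX 2080 Ti:     ~13.45 TFLOPS
```

Peak TFLOPS says nothing about per-clock (IPC) changes, which is exactly why the two cards look so close on paper.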
 
Joined
Feb 20, 2020
Messages
9,340 (6.14/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
Waiting for reviews and then some before even thinking about the 30 series, after the 20 series space invaders :)
But from the looks of it, it's hydro copper all the way; this ugly air cooler I'll never see in person.
 
Joined
May 18, 2019
Messages
34 (0.02/day)
This TGP rather than TDP thing is going to be annoying and cause confusion for people... I can see it in this thread already.

I mean, how do you compare TGP to the old TDP values? Is the 350 W TGP going to be similar to the 250 W TDP of the 2080 Ti, plus extra for board power? But then, can you see the rest of the components using another 100 W?
 
Joined
Aug 31, 2016
Messages
104 (0.04/day)
So it is basically confirmed that the 3080 will be at least 20% faster than the 2080 Ti: if it has the same number of CUDA cores but on a new architecture, there is no way for it to deliver less, considering the spec. The 3090 should be at least 50% faster. That should be the bare-minimum, conservative expectation at this point.
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
If true, this will be the first time since the GTX 580 that Nvidia is going >256-bit for the *80 again, but still well short of 384 bits, and with a price tag to choke on.
 
Joined
Apr 16, 2013
Messages
534 (0.13/day)
Location
Bulgaria
System Name Black Knight | White Queen
Processor Intel Core i9-10940X | Intel Core i7-5775C
Motherboard ASUS ROG Rampage VI Extreme Encore X299G | ASUS Sabertooth Z97 Mark S (White)
Cooling Noctua NH-D15 chromax.black | Xigmatek Dark Knight SD-1283 Night Hawk (White)
Memory G.SKILL Trident Z RGB 4x8GB DDR4 3600MHz CL16 | Corsair Vengeance LP 4x4GB DDR3L 1600MHz CL9 (White)
Video Card(s) ASUS ROG Strix GeForce RTX 4090 OC | KFA2/Galax GeForce GTX 1080 Ti Hall of Fame Edition
Storage Samsung 990 Pro 2TB, 980 Pro 1TB, 850 Pro 256GB, 840 Pro 256GB, WD 10TB+ (incl. VelociRaptors)
Display(s) Dell Alienware AW2721D 240Hz, ASUS ROG Strix XG279Q 170Hz, ASUS PA246Q 60Hz| Samsung JU7500 48'' TV
Case Corsair 7000D AIRFLOW (Black) | NZXT ??? w/ ASUS DRW-24B1ST
Audio Device(s) ASUS Xonar Essence STX | Realtek ALC1150
Power Supply Enermax Revolution 1250W 85+ | Super Flower Leadex Gold 650W (White)
Mouse Razer Basilisk Ultimate, Razer Naga Trinity | Razer Mamba 16000
Keyboard Razer Blackwidow Chroma V2 (Orange switch) | Razer Ornata Chroma
Software Windows 10 Pro 64bit
Can't wait to grab my KFA2 HOF RTX 3090 baby. :love:
 
Joined
May 15, 2014
Messages
235 (0.06/day)
This TGP rather than TDP thing is going to be annoying and cause confusion for people... I can see it in this thread already.

I mean, how do you compare TGP to the old TDP values? Is the 350 W TGP going to be similar to the 250 W TDP of the 2080 Ti, plus extra for board power? But then, can you see the rest of the components using another 100 W?
Yep. Been doing it myself using TBP instead of TGP. NV says they're similar. I wouldn't be using FrameView, though...
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
If those specs are real, AMD could easily beat them TFLOPS for TFLOPS, considering the Xbox Series X has a 12 TFLOPS GPU and you can't go wild with TDP in a small console. I think Nvidia will mostly be pushing for ray tracing performance gains. Again... if these specs are real. 17.8 TFLOPS on the RTX 3090 by the math: 5,248 CUDA cores × 2 × 1,695 MHz ≈ 17.8 TFLOPS.
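The same cores × 2 × clock estimate reproduces both the 17.8 TFLOPS figure and the console comparison; the Series X numbers below (3,328 shaders at 1,825 MHz) are Microsoft's published specs, everything else is the leak:

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak FP32 estimate: shaders x 2 FLOPs per clock (one FMA) x clock."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(fp32_tflops(5248, 1695))  # leaked RTX 3090:   ~17.8 TFLOPS
print(fp32_tflops(3328, 1825))  # Xbox Series X GPU: ~12.1 TFLOPS (52 CUs x 64 shaders)
```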
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
Edit: Maybe comparing to the RTX 3080 is more informative:

RTX 3080 4352 CUDA, 1710 MHz boost, 10 GB, 760 GB/s, 320 W, 7 nm process

RTX 2080 Ti 4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process

Almost no difference between these cards except on the RT and Tensor side. If the price is much lower than $1,000 for the 3080, then you can get 2080 Ti performance on the 'cheap'.

This could be important. The same number of CUDA cores, yet quite a bit higher TDP rating even though it's on 7 nm EUV versus 12 nm, means one thing and one thing only: the number of RT and Tensor cores will be MASSIVELY higher.
 
Joined
Nov 23, 2010
Messages
313 (0.06/day)
According to this rumor, the RTX 3080 will have variants, and some of those variants will cram in 20 GB of VRAM:

The card is reportedly going to feature up to 10 GB of memory, which is also going to be GDDR6X, but there are several vendors who will be offering the card with a massive 20 GB frame buffer at higher prices. Since the memory is running at 19 Gbps across a 320-bit bus interface, we can expect a bandwidth of up to 760 GB/s.

 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
Good. Still two 8-pin connectors from board partners. Nothing has changed. Just Nvidia being weird.

The 12-pin is hidden under the shroud. This is just for the purpose of making a detachable shroud with integrated 8-pins, instead of them being soldered on like the 1060/2060 FE.

In case the dies are bigger than 429 mm², it can't possibly be EUV.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
If those specs are real, AMD could easily beat them TFLOPS for TFLOPS, considering the Xbox Series X has a 12 TFLOPS GPU and you can't go wild with TDP in a small console. I think Nvidia will mostly be pushing for ray tracing performance gains. Again... if these specs are real. 17.8 TFLOPS on the RTX 3090 by the math: 5,248 CUDA cores × 2 × 1,695 MHz ≈ 17.8 TFLOPS.
Not to mention AMD has significantly higher clock speeds, and performance scales much better with frequency than with cores.

Problem is, we have no idea what AMD is preparing with RDNA2, outside of reasonable guesses based on those console APUs.
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
WTF is happening with the TDPs of the xx80 series? Smaller nodes, yet higher TDPs. We're approaching IR panel wattage here. A 3080 alone can heat up a 6 m² room; add a 160 W CPU on top of that and you get a decent standalone room heater. I'm not gonna install AC just to use a PC :(

GTX 980... 165 W
GTX 1080... 180 W
RTX 2080(S)... 215/250 W
RTX 3080... 320 W
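To put the heating complaint in rough numbers: essentially all of a PC's electrical draw ends up as heat in the room, so under load the rumored 3080 plus a gaming CPU really does behave like a small space heater. A back-of-the-envelope sketch, where every value is an assumption for illustration (especially the room's heat-loss coefficient):

```python
# All numbers are illustrative assumptions, not measurements.
gpu_w = 320    # rumored RTX 3080 TGP
cpu_w = 160    # example CPU package power while gaming
rest_w = 40    # assumed: fans, drives, PSU conversion losses

heat_w = gpu_w + cpu_w + rest_w  # ~520 W dumped into the room

# Crude steady-state estimate: heat lost through walls/ventilation is roughly
# proportional to the indoor-outdoor temperature difference.
room_loss_w_per_k = 150  # assumed W lost per degree C of difference (small room)

delta_t_c = heat_w / room_loss_w_per_k
print(f"{heat_w} W of heat -> roughly +{delta_t_c:.1f} degC at steady state")
```

With those assumed numbers the rise lands around +3.5 °C, which lines up with the 26 °C to 28 °C+ complaint later in the thread.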
 
Joined
Dec 30, 2010
Messages
2,087 (0.43/day)
This is just an enterprise card designed for AI/DL/whatever workloads being pushed into gaming. These cards normally fail the enterprise quality stamp, so having up to 350 W of TDP/TBP is not unheard of. It's like Linus Torvalds said about Intel: stop putting stuff in chips that only makes them look good in really specific (AVX-512) workloads. These RT/Tensor cores probably account for a big chunk of the extra power consumption.

The price is probably between $1,000 and $2,000. Nvidia is the new Apple.
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
WTF is happening with the TDPs of the xx80 series? Smaller nodes, yet higher TDPs. We're approaching IR panel wattage here. A 3080 alone can heat up a 6 m² room; add a 160 W CPU on top of that and you get a decent standalone room heater. I'm not gonna install AC just to use a PC :(

GTX 980... 165 W
GTX 1080... 180 W
RTX 2080(S)... 215/250 W
RTX 3080... 320 W
Simple. There will be a LOT more RT cores on the GPU.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Not to mention AMD have significantly higher speeds, and performance scales much better with frequency than with cores.
There is no word on whether 1.7 GHz is the base or the boost frequency.
Given that Sony can push its RDNA chip to 2.1 GHz, if 7 nm is true, NV's boost frequency must be higher than that.

Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?
 
Joined
May 2, 2016
Messages
171 (0.06/day)
I'll most probably pass. The 1080 Ti should be enough for a couple more years, until we have some real next-gen games and until we see if ray tracing really catches on or remains a gimmick. By then we should have affordable GPUs with more VRAM than the 1080 Ti's 11 GB.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I understand that, but it's still crazy. A gaming PC should not consume 500+ W of power. Most PC gamers don't live in Alaska or Greenland-like places.
And even then, room temp is still 22 °C. You act like they need to sit outside to be cooled. Remember the 295X2? A 500 W GPU...

until we see if ray tracing really catches on
Wait... so consoles and AMD are going RT, and you're questioning whether it catches on?
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
There is no word on whether 1.7 GHz is the base or the boost frequency.
Given that Sony can push its RDNA chip to 2.1 GHz, if 7 nm is true, NV's boost frequency must be higher than that.


Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?
1.7 GHz is around the boost clock for the RTX 2000 series, is it not?
Edit: It says 1710 MHz boost for the RTX 3080.
 
Joined
Feb 11, 2009
Messages
5,397 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Honestly, to me this feels a bit like Intel 6700K to 7700K type of stuff.

Just kinda meh all around... and really, 10 GB of RAM for a 3080? I would have at least given it 16 or so.

Oh well, I guess just like with Intel, this is what you get with no competition in that area; remember, the RX 5700 (XT) is really more in RTX 2060S / RTX 2070S territory.
 
Joined
May 2, 2016
Messages
171 (0.06/day)
And even then, room temp is still 22 °C. You act like they need to sit outside to be cooled. Remember the 295X2? A 500 W GPU...

Wait... so consoles and AMD are going RT, and you're questioning whether it catches on?
I'm talking about Nvidia's approach (RTX and dedicated hardware), not ray tracing as a technology. Consoles use AMD hardware, and it'll work in a different way.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
There is no word on whether 1.7 GHz is the base or the boost frequency.
Given that Sony can push its RDNA chip to 2.1 GHz, if 7 nm is true, NV's boost frequency must be higher than that.
Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?

Oh, I imagine 1.7 is the base, not the boost. But a 400 MHz boost gain is way over-optimistic, IMO; it will be more like 200-250 MHz.
And 7 nm would be at least surprising at this point, but who knows.
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
And even then, room temp is still 22 °C. You act like they need to sit outside to be cooled. Remember the 295X2? A 500 W GPU...

Wait... so consoles and AMD are going RT, and you're questioning whether it catches on?
What do you mean? My PC room warms up to 26 °C (and above when outside temps hit 35 °C+ for a few days) during the summer months, and I live in a well-insulated house in a moderate climate. Add a 500+ W PC to the room and you easily get above 28 °C. I underclock my 1080 Ti during the summer months to get it to consume around 160 W while gaming, but I can't see how I could underclock a 320 W GPU to get similar results.
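For anyone wanting the same result without manual underclocking: the driver can cap board power directly, and it then enforces the cap by dropping clocks on its own. A minimal sketch; the 160 W value is just this poster's summer target, the accepted range is whatever `nvidia-smi -q -d POWER` reports for the card, and the command needs admin/root rights:

```python
import subprocess

# Cap the board power limit to ~160 W instead of manually underclocking.
# Requires admin/root rights; the value must fall within the min/max power
# limits reported by "nvidia-smi -q -d POWER" for the installed card.
subprocess.run(["nvidia-smi", "-pl", "160"], check=True)
```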
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?
Trying to pull two-year-old arguments? Hardly. Given that many AAA titles are ray tracing enabled, that major game engines like UE now support ray tracing, and that some of the biggest games are going to be ray traced (Cyberpunk 2077 and Minecraft, just for example), nobody believes those old arguments anymore. And you seem to have a bit of a split personality: your beloved AMD is saying the new consoles and their RDNA2 GPUs will support ray tracing as well. So what are you really trying to say? Are you preparing for the eventuality that AMD's ray tracing performance sucks?
 