
NVIDIA GeForce RTX 3080 12 GB Edition Rumored to Launch on January 11th

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,190 (0.91/day)
During its CES 2022 keynote, NVIDIA updated the GeForce RTX 30 series family with the GeForce RTX 3050 and RTX 3090 Ti. However, that is not the end of NVIDIA's updates to the Ampere generation, as industry sources cited by Wccftech now suggest that a GeForce RTX 3080 GPU with 12 GB of GDDR6X VRAM could launch as a separate product. Compared to the regular RTX 3080, which carries only 10 GB of GDDR6X, the new 12 GB version is supposed to bring a slight bump to the specification list. The GA102-220 GPU SKU found inside the 12 GB variant will feature 70 SMs with 8960 CUDA cores, 70 RT cores, and 280 TMUs.
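As a quick sanity check, those rumored unit counts all follow from the SM count; here is a minimal sketch, assuming the standard Ampere GA10x ratios of 128 FP32 CUDA cores, 1 RT core, and 4 TMUs per SM (a known property of the architecture, not something confirmed for this specific SKU):

```python
# Derive the rumored RTX 3080 12 GB shader counts from the SM count,
# assuming the standard per-SM ratios of the Ampere GA10x architecture.
sms = 70
cuda_cores = sms * 128  # 128 FP32 CUDA cores per SM -> 8960
rt_cores = sms * 1      # 1 RT core per SM           -> 70
tmus = sms * 4          # 4 TMUs per SM              -> 280
print(cuda_cores, rt_cores, tmus)  # 8960 70 280
```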

This represents a minor improvement over the regular GA102-200 silicon inside the 10 GB model. The significant difference, however, is the memory organization. The new 12 GB model uses a 384-bit memory bus, allowing the GDDR6X modules, running at 19 Gbps, to achieve a bandwidth of 912 GB/s. The overall TDP also receives a bump to 350 Watts, compared to the 320 Watts of the regular RTX 3080 model. For more information regarding final clock speeds and pricing, we have to wait for the alleged launch date of January 11th.
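The 912 GB/s figure follows directly from the bus width and the per-pin data rate; a minimal sketch of that arithmetic, using the numbers above:

```python
# Peak memory bandwidth = bus width (bits) / 8 bits-per-byte * per-pin rate (Gbps).
bus_width_bits = 384  # rumored 12 GB model
data_rate_gbps = 19   # GDDR6X speed quoted above
bandwidth = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth:.0f} GB/s")  # 912 GB/s
```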


View at TechPowerUp Main Site
 

aQi

Joined
Jan 23, 2016
Messages
645 (0.22/day)
Keep playing, Nvidia. AMD is already managing to deliver parallel performance while introducing new technologies that harmonize GPU and CPU advantages, since it produces both itself. Intel will eventually offer the same once we see the Arc dGPUs.
Where does the green team lead us? An extra 2 GB and slight bumps?
 
Joined
Mar 28, 2020
Messages
1,632 (1.12/day)
To no fanfare. This new SKU will just end up costing more than the 10 GB version and less than the Ti version. Ultimately, availability is still poor and prices are still absurdly inflated.
 
Joined
Mar 28, 2020
Messages
1,632 (1.12/day)
Keep playing, Nvidia. AMD is already managing to deliver parallel performance while introducing new technologies that harmonize GPU and CPU advantages, since it produces both itself. Intel will eventually offer the same once we see the Arc dGPUs.
Where does the green team lead us? An extra 2 GB and slight bumps?
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM: to mitigate its disadvantage here.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM: to mitigate its disadvantage here.

yeah, they won't get ARM; the world has already made that clear to Nvidia. Jensen can hug his leather jackets at night and cry over it if he wants, lol
 

aQi

Joined
Jan 23, 2016
Messages
645 (0.22/day)
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM: to mitigate its disadvantage here.
That ARM acquisition could affect the handheld battle quite a bit, as AMD and Samsung are jointly bringing RDNA2 to portables through Exynos this month. Seeing Nvidia develop a RISC-based processor to compete here would be something neat, rather than playing with VRAM increases and resized Tensor cores. Admittedly, Nvidia introduced ray tracing, but it's of no use if the end user cannot enjoy it. That's what Intel and AMD are aiming for, and upcoming iGPUs will also have ray tracing.
 

TheHughMan

New Member
Joined
May 26, 2021
Messages
21 (0.02/day)
Despite mining at full capacity with no room for more growth, they still buy all the GPUs
 
Joined
Feb 26, 2016
Messages
546 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling Corsair H170i Elite Cappelix w/ NF-A14 iPPC IP67 fans
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 1
Power Supply EVGA 1600 T2 w/ NF-A14 iPPC IP67 fan
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Logitech G910 Stickerbombed
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
It's like MSRP is meaningless
NVIDIA and AMD should stop putting MSRPs on these graphics cards and let the public decide what the price is. Auction them off like Intel did with the 9990XE. There's no fucking point to MSRP anymore.
 
Joined
Apr 19, 2018
Messages
958 (0.44/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Pay $2000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid-to-high-end cards. nGreedia have been very clever with their drip-drip-dripping of higher memory densities, which should never have been a thing in the first place!
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Pay $2000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid-to-high-end cards. nGreedia have been very clever with their drip-drip-dripping of higher memory densities, which should never have been a thing in the first place!

ya, and they won't be able to justify these prices in 2024 for next-gen cards; AMD will undercut the living **** out of them. If it wasn't for AMD, Nvidia would never come down in price in 2024. Also, let's hope Intel's GPUs are competitive; the more the better, to keep these greedy bastards in check.
 
Joined
Sep 17, 2014
Messages
20,780 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Pay $2000 for one of these now, and in less than a years time, 16GB will be the new standard on all mid to high end cards. nGreedia have been very clever with their drip drip dripping of higher memory densities, which should have never been a thing in the first place!

Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch. 8 GB required is already 'mid-range territory', even below 4K.

Nvidia is being the Intel right now: pushing metrics nobody cares about at excessive power figures, and pioneering wild technologies that have no industry support, like RT or E-cores, to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing and Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch. 8 GB required is already 'mid-range territory', even below 4K.

Nvidia is being the Intel right now: pushing metrics nobody cares about at excessive power figures, and pioneering wild technologies that have no industry support, like RT or E-cores, to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing and Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.

If you have not read my hypothesis on the shortage, I recommend it here:

 
Joined
Sep 17, 2014
Messages
20,780 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Yeah, I think it's nonsense, sorry.

seriously? seems quite obvious to me. There are loads of third-party sellers now on Amazon, Walmart, etc., way more than there used to be. It's an easy way to make a quick buck. A lot of them run bots, so supply will never meet demand; there are so many third-party sellers sucking up items before they drop that the only way to beat them is if Walmart, Best Buy, etc. start selling PS5s or GPUs in store only.

no need to be sorry, we can agree to disagree.
 
Joined
Dec 17, 2011
Messages
359 (0.08/day)


Hmm... I can't help but wonder what would have happened to the TDP if Nvidia had chosen GDDR6 instead of GDDR6X. The RTX 3070 increased its power consumption by 30% when it switched to GDDR6X with the 3070 Ti.

Of course, then it would be a side-grade from the RTX 3080's 10 GB of 760 GB/s memory (320-bit bus × GDDR6X at 19 Gbps) to 12 GB of 768 GB/s memory (384-bit bus × GDDR6 at 16 Gbps). But we would see much lower power consumption, maybe a sub-300-watt 3080.
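Both of those figures check out with the same bandwidth formula behind the article's 912 GB/s number; a minimal sketch comparing the real card with this poster's hypothetical GDDR6 variant:

```python
# Peak bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gb_s(320, 19))  # 760.0 GB/s - RTX 3080 10 GB, GDDR6X at 19 Gbps
print(bandwidth_gb_s(384, 16))  # 768.0 GB/s - hypothetical 12 GB GDDR6 at 16 Gbps
```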
 