
NVIDIA GeForce RTX 3080 12 GB Edition Rumored to Launch on January 11th

AleksandarK

Staff member
Joined
Aug 19, 2017
Messages
1,693 (0.85/day)
During its CES 2022 keynote, NVIDIA updated the GeForce RTX 30 series family with the GeForce RTX 3050 and RTX 3090 Ti. However, that is not the end of NVIDIA's updates to the Ampere generation, as industry sources cited by Wccftech suggest we could see a GeForce RTX 3080 GPU with 12 GB of GDDR6X VRAM launched as a separate product. Compared to the regular RTX 3080, which carries only 10 GB of GDDR6X, the new 12 GB version is supposed to bring a slight bump to the specification list. The GA102-220 GPU SKU found inside the 12 GB variant will feature 70 SMs with 8,960 CUDA cores, 70 RT cores, and 280 TMUs.

This represents a minor improvement over the regular GA102-200 silicon inside the 10 GB model. The more significant difference, however, is the memory organization: the new 12 GB model uses a 384-bit memory bus, allowing its GDDR6X modules to achieve a bandwidth of 912 GB/s while running at 19 Gbps. The overall TDP also receives a bump to 350 W, compared to 320 W for the regular RTX 3080. For final clock speeds and pricing, we will have to wait for the alleged launch date of January 11th.
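The quoted bandwidth figures follow directly from bus width and per-pin data rate; a quick sketch of the arithmetic (just the generic formula, nothing NVIDIA-specific):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Regular RTX 3080: 320-bit bus, 19 Gbps GDDR6X
print(peak_bandwidth_gbs(320, 19.0))  # 760.0 GB/s
# Rumored 12 GB model: 384-bit bus at the same 19 Gbps
print(peak_bandwidth_gbs(384, 19.0))  # 912.0 GB/s
```

The extra 64 bits of bus width alone account for the jump from 760 GB/s to the quoted 912 GB/s.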


View at TechPowerUp Main Site
 

aQi

Joined
Jan 23, 2016
Messages
622 (0.24/day)
Keep playing, Nvidia. AMD is already managing parallel performance while introducing new technologies that harmonise GPU and CPU advantages, since it produces both itself. Intel will eventually offer the same once we see the ARC dGPU.
Where does the green team lead us? An extra 2 GB and slight bumps?
 
Joined
Mar 28, 2020
Messages
1,450 (1.39/day)
To no fanfare. This new SKU will just end up costing more than the 10 GB version and less than the Ti version. Ultimately, availability is still poor and prices are still absurdly inflated.
 
Joined
Mar 28, 2020
Messages
1,450 (1.39/day)
Keep playing, Nvidia. AMD is already managing parallel performance while introducing new technologies that harmonise GPU and CPU advantages, since it produces both itself. Intel will eventually offer the same once we see the ARC dGPU.
Where does the green team lead us? An extra 2 GB and slight bumps?
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM, to mitigate the disadvantage here.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
11,878 (3.92/day)
Location
Kepler-186f
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM, to mitigate the disadvantage here.

yeah they won't get ARM, the world has already made that clear to Nvidia, Jensen can hug his leather jackets at night and cry over it if he wants lol
 

aQi

Joined
Jan 23, 2016
Messages
622 (0.24/day)
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to pay up and acquire ARM, to mitigate the disadvantage here.
That ARM acquisition could affect the handheld battle quite a bit, as AMD and Samsung are jointly bringing RDNA2 to portables through Exynos this month. Seeing Nvidia develop a RISC-based processor to compete here would be something neat, rather than playing with VRAM increases and resizing of Tensor cores. Admittedly, Nvidia introduced ray tracing, but it's of no use if the end user cannot enjoy it. That's what Intel and AMD are aiming for, and upcoming iGPUs will also have ray tracing.
 

TheHughMan

New Member
Joined
May 26, 2021
Messages
20 (0.03/day)
Despite mining at full capacity with no room for more growth, they still buy all the GPUs.
 
Joined
Feb 26, 2016
Messages
497 (0.20/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 5.3 GHz 8c8t
Motherboard ASUS Maximus X Hero (Wi-Fi)
Cooling Corsair H150i Elite Cappelix w/ NF-A12 Chromax fans
Memory 2x16GB G.Skill TridentZ @3500 MHz CL15
Video Card(s) EVGA RTX 2080 Ti Black
Storage Samsung 860 PRO 256GB 2.5" SSD, Intel D3-S4510 7.68TB 2.5" SSD, Samsung 870 Evo 500GB 2.5" SSD
Display(s) Asus VG259QM, Dell P2314H
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω
Power Supply EVGA 1200 P2
Mouse Logitech G Pro Wireless
Keyboard Logitech G910 Stickerbombed + Corsair K55
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
It's like MSRP is meaningless
NVIDIA and AMD should stop putting MSRPs on these graphics cards and let the public decide what the price is. Auction them off like Intel did with the 9990XE. No fucking point to MSRP anymore.
 
Joined
Apr 19, 2018
Messages
775 (0.44/day)
Processor AMD Ryzen 9 3900X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Corsair Hydro H115i
Memory 16Gb CL14 Ripjaws V @3666MHz
Video Card(s) MSI GeForce RTX2070
Storage Samsung 970 EVO Plus SSD
Display(s) Korean Unbadged
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G703
Keyboard Crap!
Pay $2,000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid- to high-end cards. nGreedia have been very clever with their drip-drip-dripping of higher memory densities, which should never have been a thing in the first place!
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
11,878 (3.92/day)
Location
Kepler-186f
Pay $2,000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid- to high-end cards. nGreedia have been very clever with their drip-drip-dripping of higher memory densities, which should never have been a thing in the first place!

ya and they won't be able to justify these prices in 2024 for next-gen cards, AMD will undercut the living **** out of them. if it wasn't for AMD, Nvidia would never come down in price in 2024. also let's hope Intel GPUs are competitive, the more the better to keep these greedy bastards in check.
 
Joined
Sep 17, 2014
Messages
17,962 (5.86/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define C TG
Audio Device(s) Situational :)
Power Supply EVGA G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Pay $2,000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid- to high-end cards. nGreedia have been very clever with their drip-drip-dripping of higher memory densities, which should never have been a thing in the first place!

Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch: 8 GB required is already 'mid-range territory', even below 4K.

Nvidia is being the Intel right now, pushing metrics nobody cares about at excessive power figures. Pioneering wild technologies that have no support in the industry, like RT or E-cores, to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing + Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
11,878 (3.92/day)
Location
Kepler-186f
Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch: 8 GB required is already 'mid-range territory', even below 4K.

Nvidia is being the Intel right now, pushing metrics nobody cares about at excessive power figures. Pioneering wild technologies that have no support in the industry, like RT or E-cores, to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing + Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.

If you haven't read my hypothesis on the shortage, I recommend it here:

 
Joined
Sep 17, 2014
Messages
17,962 (5.86/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define C TG
Audio Device(s) Situational :)
Power Supply EVGA G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
11,878 (3.92/day)
Location
Kepler-186f
Yeah, I think it's nonsense, sorry.

seriously? seems quite obvious to me. there are loads of third-party sellers now on amazon/walmart, etc., way way more than there used to be. it's an easy way to make a quick buck. lots of them have bots; supply will never meet demand because there are so many third-party sellers sucking up items before they drop. the only way to beat them is if walmart and best buy, etc. start selling ps5s or gpus in store only.

no need to be sorry, we can agree to disagree
 
Joined
Dec 17, 2011
Messages
358 (0.09/day)


Hmm... I can't help but wonder what would have happened to TDP if Nvidia had gone with GDDR6 instead of GDDR6X. The RTX 3070 increased its power consumption by 30% when it switched to GDDR6X with the 3070 Ti.

Of course, then it would be a side-grade from the RTX 3080's 10 GB of 760 GB/s memory (320-bit × 19 Gbps GDDR6X) to 12 GB of 768 GB/s memory (384-bit × 16 Gbps GDDR6). But we would see much lower power consumption, maybe a sub-300-watt 3080.
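The side-grade claim checks out arithmetically; a minimal sketch comparing the real 10 GB configuration against this hypothetical GDDR6 one:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Actual RTX 3080 10 GB: 320-bit bus, 19 Gbps GDDR6X
gddr6x_10gb = peak_bandwidth_gbs(320, 19.0)  # 760.0 GB/s
# Hypothetical 12 GB config: 384-bit bus, 16 Gbps GDDR6
gddr6_12gb = peak_bandwidth_gbs(384, 16.0)   # 768.0 GB/s

# Only about a 1% bandwidth gain, i.e. a side-grade
print(f"delta: {gddr6_12gb - gddr6x_10gb:.0f} GB/s")  # delta: 8 GB/s
```

The wider bus almost exactly offsets the slower per-pin rate, which is why the post calls it a side-grade on bandwidth while the power savings would come from dropping GDDR6X.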
 