
NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
AMD buyer's cycle: "I'm waiting for the next generation" [New product launches] "Seems nice, but I'll wait for the next one which will be even better"

Nvidia buyer's cycle: "I NEED IT RIGHT NOW, GIMME GIMME" [New product launches] "SHIT, I SHOULD'VE WAITED FOR THE BETTER VERSION"

Perfect summary, and that's why Nvidia is so filthy rich right now :respect:
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
It's not listed, but I could see an RTX 3060S 16GB being possible eventually as well.
That would be interesting.

Whatever happened to just giving the different GPU tiers normal amounts of VRAM... 1, 2, 4, 8, 16, 32GB etc.? I don't get why we need 10GB or 20GB...
With you on that one. I want a 16GB model of the 3080. I don't mind if it only has a 256-bit memory bus. A 16GB 3070 would also be good.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.58/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
What are the price/performance expectations for the RTX 3060?

Price ~$350
Performance ~RTX 2070
 
Joined
Jan 8, 2017
Messages
8,862 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
A 20GB 3080 would definitely be more enticing but I don't want to find out what the price would be. By the way, 12GB is much more likely than 20.

Whatever happened to just giving the different GPU tiers normal amounts of VRAM... 1, 2, 4, 8, 16, 32GB etc.? I don't get why we need 10GB or 20GB...

You can't just pick any memory configuration; the VRAM capacity is tied to what the GPU's memory controllers and bus interfaces can do.
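To make that concrete, here's a minimal sketch of how bus width and chip density pin down the possible capacities (the bus widths and chip densities below are the commonly reported figures for these cards, not confirmed specs):

```python
# Minimal sketch: VRAM capacity = number of 32-bit channels x chip density.
# Bus widths and chip densities are the commonly reported figures, not confirmed specs.

def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    """Capacities (GB) a bus width allows, with one GDDR6/6X chip per
    32-bit channel, or two per channel in clamshell ("piggy-back") mode."""
    channels = bus_width_bits // 32
    options = set()
    for density in chip_densities_gb:      # 8 Gb (1 GB) or 16 Gb (2 GB) chips
        for chips_per_channel in (1, 2):   # 2 = clamshell mode
            options.add(channels * density * chips_per_channel)
    return sorted(options)

print(vram_options(320))  # 320-bit bus (3080-style): [10, 20, 40] GB
print(vram_options(256))  # 256-bit bus: [8, 16, 32] GB
print(vram_options(192))  # 192-bit bus: [6, 12, 24] GB
```

That's why a 320-bit card ends up at 10GB or 20GB rather than the "round" 8 or 16.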
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Are people so desperate that they can't wait another month to see what AMD have to offer? Nvidia have a history of mugging off their customers; day-one purchasers are setting themselves up for buyer's remorse.

Give it a month or two and who knows, Turing cards may start tumbling or AMD could knock it out of the park. Personally I'm just being sensible and buying a PS5 for the same cost as one of these overpriced GPUs.
That's cool, though buying a console that's only good for playing games, buying again the roughly 300 games I already own on Steam, and playing them with a useless controller instead of WASD is totally out of the question for me. Besides, building a new PC is fun; plugging a box into my TV is boring.

As for the desperation part: I agree. Better to wait than to buy the first released, inferior product.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
What are the price/performance expectations for the RTX 3060?

Price ~$350
Performance ~RTX 2070

$350 is usually where about 60% of the performance of the $700 xx80 card lands, like the 1060 and 2060 were.

RTX 3060 8GB (4864 CUDA) ~~ RTX 2080/S ~~ 60% of RTX 3080

The RTX 3080 10GB with its 8704 new shaders is ~31% faster than the 2080 Ti's 4352 Turing shaders, while its memory is only ~23% faster,
so to average out at 31%, the shader side must be pulling ahead by ~39%, which works out to roughly the equivalent of 6144 FP32 shaders plus 2560 INT32.
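A minimal sketch of that back-of-envelope estimate, using the published memory specs and assuming, as the post does, that overall performance is a simple average of shader and memory scaling:

```python
# Back-of-envelope version of the estimate above. The ~31% overall uplift and
# the "performance = average of shader and memory scaling" assumption are the
# post's own, not official figures.

mem_bw_2080ti = 14 * 352 / 8        # 14 Gbps x 352-bit bus = 616 GB/s
mem_bw_3080   = 19 * 320 / 8        # 19 Gbps x 320-bit bus = 760 GB/s
mem_gain = mem_bw_3080 / mem_bw_2080ti - 1      # ~23%

overall_gain = 0.31                              # assumed average uplift
shader_gain = 2 * overall_gain - mem_gain        # ~39% if perf = avg(shader, mem)

turing_shaders = 4352
turing_equivalent = turing_shaders * (1 + shader_gain)   # ~6000 "Turing-equivalent" shaders

print(f"memory +{mem_gain:.0%}, implied shader gain +{shader_gain:.0%}, "
      f"~{turing_equivalent:.0f} Turing-equivalent shaders")
```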
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Maybe they should launch the original 3080 first? I really don't consider meeting less than 1% of the demand a real launch.

Look at my signature. Soon you will join me in the true gaming realm my padawan.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
That makes no sense. What defines a 60-series card? Mainly that it's two tiers down from the 80-series. Which die it's based on, how wide a memory bus it has, etc. is all variable and dependent on factors that come before product segmentation (die yields, production capacity, etc.). Which die the 3060 is based on doesn't matter whatsoever - its specifications decide performance. Heck, there have been 2060s based on at least three different Turing dies, and they all perform identically in most workloads. The 3060 may well be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other), or at least the PS5 performance level, but that still doesn't mean 8GB isn't plenty for it. People really need to stop this idiocy about VRAM usage being the be-all, end-all of performance longevity - that is only true for a select few GPUs throughout history.
PCI-e 4.0 x16 does not really seem to be a bottleneck yet and probably won't be a big one for a long while when we look at how the scaling testing has gone with 3.0 and 2.0. Fitting 4 lanes' worth of data shouldn't matter all that much. On the other hand, I think this shader-augmented compression is short-lived - if it proves very useful, compression will move into hardware as it has already supposedly done in consoles.

Moving storage to be attached to the GPU does not really make sense for the desktop/gaming use case. More bandwidth through compression and some type of QoS scheme to prioritize data as needed should be enough, and that is where it really seems to be heading.
I agree that it'll likely move into dedicated hardware, but that hardware is likely to live on the GPU, as that is where the data will be needed. Adding this to CPUs makes little sense - people keep CPUs longer than GPUs, GPUs have (much) bigger power budgets, and for the type of compression in question (specifically DirectStorage-supported algorithms) games are likely to be >99% of the use cases.

As for creating a bottleneck, it's still going to be quite a while until GPUs saturate PCIe 4.0 x16 (or even 3.0 x16), but SSDs use a lot of bandwidth and need to communicate with the entire system, not just the GPU. Sure, the GPU will be what needs the biggest chunks of data the quickest, but chaining an SSD off the GPU still makes far less sense than just keeping it directly attached to the PCIe bus like we do today. That way everything gets near optimal access. The only major improvement over this would be the GPU using the SSD as an expanded memory of sorts (like that oddball Radeon Pro did), but that would mean the system isn't able to use it as storage. And I sincerely doubt people would be particularly willing to add the cost of another multi-TB SSD to their PCs without getting any additional storage in return.
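For scale, a quick sketch of the nominal per-direction link bandwidths involved (theoretical figures after encoding overhead; real-world throughput is lower):

```python
# Nominal per-direction PCIe bandwidth per lane, in GB/s (after 128b/130b encoding).
GBPS_PER_LANE = {3.0: 0.985, 4.0: 1.969}

def link_bw(gen, lanes):
    return GBPS_PER_LANE[gen] * lanes

gpu_x16 = link_bw(4.0, 16)   # ~31.5 GB/s for the GPU's x16 slot
ssd_x4  = link_bw(4.0, 4)    # ~7.9 GB/s for an NVMe x4 drive

print(f"PCIe 4.0 x16: ~{gpu_x16:.1f} GB/s, x4 NVMe: ~{ssd_x4:.1f} GB/s "
      f"({ssd_x4 / gpu_x16:.0%} of the GPU link)")
```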
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.43/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Not sure but I thought the idea was to load textures/animations/model vertex data etc into VRAM where it needs to go anyway.
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.
 
Joined
Jun 13, 2019
Messages
474 (0.27/day)
System Name Fractal
Processor Intel Core i5 13600K
Motherboard Asus ProArt Z790 Creator WiFi
Cooling Arctic Cooling Liquid Freezer II 360
Memory 16GBx2 G.SKILL Ripjaws S5 DDR5 6000 CL30-40-40-96 (F5-6000J3040F16GX2-RS5K)
Video Card(s) PNY RTX A2000 6GB
Storage SK Hynix Platinum P41 2TB
Display(s) LG 34GK950F-B (34"/IPS/1440p/21:9/144Hz/FreeSync)
Case Fractal Design R6 Gunmetal Blackout w/ USB-C
Audio Device(s) Steelseries Arctis 7 Wireless/Klipsch Pro-Media 2.1BT
Power Supply Seasonic Prime 850w 80+ Titanium
Mouse Logitech G700S
Keyboard Corsair K68
Software Windows 11 Pro
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.

Every thread is going on about this VRAM "issue". Look at the system requirements for Cyberpunk 2077...I don't think we're hitting a wall here anytime soon.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
The 3060 may well be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other),

3060 6GB: 3840 CUDA
3060 8GB: 4864 CUDA

There you have it, the 8GB version is ~30% faster.
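For reference, the core-count arithmetic behind that (rumoured counts from the post above, nothing official):

```python
# Rumoured CUDA core counts; the ratio comes out to ~27% more cores,
# which the post rounds up to roughly 30%.
cuda_3060_6gb = 3840
cuda_3060_8gb = 4864
print(f"{cuda_3060_8gb / cuda_3060_6gb - 1:.0%} more CUDA cores")   # 27%
```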
 
Joined
Sep 17, 2014
Messages
20,780 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I thought 10GB was enough guys... :)

Guess Nvidia doesn't agree and brings us the real deal after the initial wave of stupid bought the subpar cards.

Well played, Huang.
 
Joined
Apr 30, 2008
Messages
4,875 (0.84/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory G.Skill Trident Z5 Neo 32GB 6000MHz
Video Card(s) Galax RTX 4060 8GB (Temporary Until Next Gen)
Storage Kingston KC3000 M.2 1TB + 2TB HDD
Display(s) Asus TUF 24Inch 165Hz || AOC 24Inch 180Hz
Case Cooler Master NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
If nVidia taught us something with the 20 (Super) series, it's the fact that early buyers get inferior products. That's why I'm going to wait for the 3070 Super/Ti with 16 GB VRAM and a (hopefully) fully unlocked die, unless AMD's RDNA 2 proves to be a huge hit.

This times 100. I'm playing the waiting game; tbh we all are, 'cause no one can get a new card anyway.
 
Joined
Feb 14, 2012
Messages
2,304 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Need to see 3070S 16GB vs 2080Ti benchmarks.
 
Joined
Oct 19, 2007
Messages
8,185 (1.36/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RBG fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RBG fans
Memory Corsair Vengeance RBG 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB , 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RBG Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
3080 20GB is going to be the Ti variant. Calling it now.
 
Joined
Mar 21, 2016
Messages
2,194 (0.75/day)
Why on earth would a 60-tier GPU in 2020 need 16GB of VRAM? Even 8GB is plenty for the types of games and settings that GPU will be capable of handling for its useful lifetime. Shader and RT performance will become a bottleneck at that tier long before 8GB of VRAM does. This is no 1060 3GB.

RAM chip availability is likely the most important part here. The FE PCB only has that many pads for VRAM, so they'd need double density chips, which likely aren't available yet (at least at any type of scale). Given that GDDR6X is a proprietary Micron standard, there's only one supplier, and it would be very weird if Nvidia didn't first use 100% of available capacity to produce the chips that will go with the vast majority of SKUs.

Does that actually make sense, though? That would mean the GPU sharing the PCIe bus with storage for all CPU/RAM accesses to said storage (and either adding some sort of switch to the card, or adding switching/passthrough capabilities to the GPU die), rather than it being used for only data relevant to the GPU. Isn't a huge part of the point of DirectStorage the ability to transfer compressed data directly to the GPU, reducing bandwidth requirements while also offloading the CPU and also shortening the data path significantly? The savings from having the storage hooked directly to the GPU rather than the PC's PCIe bus seem minuscule in comparison to this - unless you're also positing that this on-board storage will have a much wider interface than PCIe 4.0 x4, which would be a whole other can of worms. I guess it might happen (again) for HPC and the like, for those crunching multi-TB datasets, but other than that this seems nigh on impossible both in terms of cost and board space, and impractical in terms of providing actual performance gains.

Btw, the image also lists two 3080 Super SKUs that the news post doesn't mention.
I didn't say 16GB would be practical, but I could still see it happening. To be fair, if they could piggy-back their way to 12GB on an RTX 3060 down the road, that would make much more sense in relation to the weaker hardware. You might say it's similar to the RTX 3080 situation going from 10GB to 20GB: if they could simply piggy-back a few of the chips rather than every GDDR chip, and scale the density further that way, that's probably a more ideal scenario for everyone involved except Micron.
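As a rough illustration of the "piggy-back only some channels" idea (the bus widths here are assumptions based on the rumoured specs, and asymmetric configurations like this come with their own bandwidth caveats):

```python
# Sketch of partial-clamshell capacities: double up the 1 GB chips on only
# some of the 32-bit channels. Bus widths are assumptions from rumoured specs,
# and mixed configs like this segment the memory (GTX 660 Ti-style).
def partial_clamshell_gb(bus_width_bits, doubled_channels, chip_gb=1):
    channels = bus_width_bits // 32
    assert 0 <= doubled_channels <= channels
    return (channels + doubled_channels) * chip_gb

print(partial_clamshell_gb(256, 0))    # plain 8 GB on a 256-bit bus
print(partial_clamshell_gb(256, 4))    # 12 GB by doubling 4 of the 8 channels
print(partial_clamshell_gb(320, 10))   # full clamshell: 10 GB -> 20 GB on 320-bit
```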

Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
Agreed, it's more marketing than practicality. That said, it's a new-generation GPU with new capabilities, so while it might be more anemic and stretched thin in resources for that amount of VRAM, perhaps the newer hardware is at least more capable of utilizing it by intelligently managing it with other techniques (DLSS, VRS, mesh shading, etc.). But we'll see; the RTX 3060 isn't even out yet, so we don't know nearly enough to conclude what it'll be like. On the plus side, GPUs are becoming more flexible at managing resources every generation.
 
Joined
Jan 5, 2006
Messages
17,693 (2.66/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MHz CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
by piggy-backing two of these chips per 32-bit channel (chips on either side of the PCB).

How hot would these chips get with some thermal pads and just a backplate on top of them?
 
Joined
Sep 10, 2020
Messages
60 (0.05/day)
Answer from u/NV_Tim, Community Manager from NVIDIA GeForce Global Community Team

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
-
Justin Walker, Director of GeForce product management

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples. If you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
-
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
If MVDDC power shows 70 watts on the 3080, that's 7 watts per chip, which is hard to cool. On the 3090 I think those are higher-density chips, not piggy-backed.

According to the specs, the 3070S should land at around 70% of the 3080, with the 2080 Ti at 76%. Pretty close.
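The arithmetic behind those two figures (the 70 W MVDDC reading and the 70%/76% relative-performance numbers are the post's own assumptions):

```python
# Per-chip power and relative performance, using the post's own figures.
mvddc_watts = 70
chips = 10                      # 10 x 1 GB GDDR6X chips on a 320-bit bus
print(f"~{mvddc_watts / chips:.0f} W per memory chip")

rel_3070s  = 0.70               # estimated 3070S performance vs 3080
rel_2080ti = 0.76               # 2080 Ti vs 3080
print(f"3070S within ~{(rel_2080ti - rel_3070s) * 100:.0f} points of a 2080 Ti")
```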
 
Joined
Apr 13, 2009
Messages
230 (0.04/day)
System Name NERV
Processor AMD Ryzen 5 2600X
Motherboard ASROCK B450M Steel Legend
Cooling Artic Freezer 33 One + 2x Akasa 140mm
Memory 2x8 Crucial Ballistix Sport 2993 MHz
Video Card(s) KFA2 GeForce RTX 3060 Ti
Storage Crucial Mx500 500GB + 1TB HDD
Display(s) Samsung C34H892
Case CM Masterbox Q300L
Audio Device(s) ALC892 + Topping D30
Power Supply Corsair RM650
Mouse CM Mastermouse Lite S
Keyboard Logitech G510
Software Win 10 Pro x64
Benchmark Scores No bech, only game!
Joined
Jul 7, 2019
Messages
821 (0.48/day)
I feel the Radeon SSG prototype helped inform the direct access capability that both upcoming consoles use in slightly different ways, as well as a future possible interim-upgrade path on theoretical mid-high end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can both be used as extra storage on top of being used as an ultra-fast scratch drive, according to Anandtech's article, with only the main hurdle being getting software devs to incorporate the necessary API stuff. It could be a neat feature to install your high-end games onto the GPU drive or save your project to said drive, and let the GPU load it direct from there and effectively "stream" the project/game assets in realtime.

I could see a future Radeon X700+ series and NVIDIA X070+ series of GPUs and their professional equivalents incorporating an option to install an NVMe PCIe 4.0 (or 5.0, since that tech is supposedly due late next year or in 2022 and expected to last for quite a while) drive onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons to their own respective GPUs, acting more like reserve VRAM expansion thanks to higher read/write performance than typical NVMe, but less than GDDR/HBM.
 
Joined
Dec 30, 2010
Messages
2,082 (0.43/day)
The logical next step to DirectStorage and RTX-IO is graphics cards with resident non-volatile storage (i.e. SSDs on the card). I think there have already been some professional cards with onboard storage (though not leveraging tech like DirectStorage). You install optimized games directly onto this resident storage, and the GPU has its own downstream PCIe root complex that deals with it.

So I expect RTX 40-series "Hopper" to have 2 TB ~ 8 TB variants.

It doesn't work like that. It's only meant/designed as a cache, without streaming the data from SSD/HDD/memory over the PCI-E bus.

But I'm sure you could utilize all that memory one day as some kind of storage.
 
Joined
Mar 21, 2016
Messages
2,194 (0.75/day)
How hot would these chips get with some thermal pads and just a backplate on top of them?
Who knows? I'm sure that if they get really hot, the backplate would be an obvious indicator. RAM generally isn't particularly hot in the first place, though. And it's not like that couldn't be resolved trivially by connecting it to the bottom heatsink with some heat pipes.
I feel the Radeon SSG prototype helped inform the direct access capability that both upcoming consoles use in slightly different ways, as well as a future possible interim-upgrade path on theoretical mid-high end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can both be used as extra storage on top of being used as an ultra-fast scratch drive, according to Anandtech's article, with only the main hurdle being getting software devs to incorporate the necessary API stuff. It could be a neat feature to install your high-end games onto the GPU drive or save your project to said drive, and let the GPU load it direct from there and effectively "stream" the project/game assets in realtime.

I could see a future Radeon X700+ series and NVIDIA X070+ series of GPUs and their professional equivalents incorporating an option to install an NVMe PCIe 4.0 (or 5.0, since that tech is supposedly due late next year or in 2022 and expected to last for quite a while) drive onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons to their own respective GPUs, acting more like reserve VRAM expansion thanks to higher read/write performance than typical NVMe, but less than GDDR/HBM.
I already thought about the Optane thing. AMD could counter that with a DDR or LPDDR DIMM combined with a microSD card, doing RAM-disk backups to the non-volatile storage and leveraging an integrated CPU chip which could also handle compression/decompression to further improve performance. Optane is a fair amount slower than even the DDR option. Optane is cheaper per gigabyte than DDR, but it doesn't take much DDR to speed up a whole lot of memory; the reality is you're mostly confined by the interface. That's why the Gigabyte i-RAM was kind of a dud on performance relative to what you'd have hoped for, and it's also a big limitation on HDDs' on-board cache performance, since that too is limited by the interface protocol it's attached to. NVMe is infinitely better than SATA III in that regard, especially on PCIe 4.0. Unfortunately, HDDs have pretty much ceased to innovate since SSDs took over on the performance side.
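For context, rough sequential-bandwidth figures for the tiers being compared here (ballpark numbers for typical parts of that era, not exact specs):

```python
# Ballpark sequential bandwidth of the tiers discussed above, in GB/s.
# These are rough figures for typical parts, not exact specs.
tiers_gb_per_s = {
    "SATA III SSD":           0.55,
    "Optane NVMe (x4)":       2.5,
    "NAND NVMe (PCIe 4.0)":   7.0,
    "DDR4-3200, dual ch.":    51.2,
    "GDDR6 14 Gbps, 256-bit": 448.0,
}
for name, bw in tiers_gb_per_s.items():
    print(f"{name:>24}: ~{bw:g} GB/s")
```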
 