
NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

Joined
Nov 11, 2016
Messages
603 (0.42/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
AMD buyer's cycle: "I'm waiting for the next generation" [New product launches] "Seems nice, but I'll wait for the next one which will be even better"

Nvidia buyer's cycle: "I NEED IT RIGHT NOW, GIMME GIMME" [New product launches] "SHIT, I SHOULD'VE WAITED FOR THE BETTER VERSION"
Perfect summary, and that's why Nvidia is so filthy rich right now :respect:
 
Joined
Jul 5, 2013
Messages
10,891 (4.07/day)
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
It's not listed, but I could see an RTX 3060S 16GB being possible eventually as well.
That would be interesting.

Whatever happened to just giving the different levelled tiers of GPUs normal amounts of VRAM... 1, 2, 4, 8, 16, 32GB etc.. I don't get it why we need 10GB or 20GB....
With you on that one. I want a 16GB model of the 3080. I don't mind if it's only 256bit memory bus. A 16GB 3070 would also be good.
 

r9

Joined
Jul 28, 2008
Messages
2,687 (0.60/day)
System Name PC1| PC2|Poweredge r410
Processor i5 6600k| Ryzen 1600| 2 x E5620 @2.4GHz
Memory 16GB DDR4 |16GB DDR4 | 32GB ECC DDR3
Video Card(s) GTX 1070|2 x RX570 |On-Board
Storage 512GB SSD+1TB SSD|512GB SSD+1TB|2x256GBSSD 2x2TBGB
Display(s) 27" Dell + 2 x 24" LCD Setup
Software Windows 10 |Windows 10| Server 2012 r2
What are the price/performance expectations for the RTX 3060?

Price: ~$350
Performance: ~RTX 2070
 
Joined
Jan 8, 2017
Messages
5,893 (4.24/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
A 20GB 3080 would definitely be more enticing but I don't want to find out what the price would be. By the way, 12GB is much more likely than 20.

Whatever happened to just giving the different levelled tiers of GPUs normal amounts of VRAM... 1, 2, 4, 8, 16, 32GB etc.. I don't get it why we need 10GB or 20GB....
You can't just use any memory configuration; VRAM capacity is tied to what the GPU's memory controllers and bus interfaces can support.
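As a rough illustration of that constraint (a sketch, not how Nvidia actually plans SKUs): total capacity comes in multiples of the channel count, with chip density and clamshell mounting (two chips per 32-bit channel) as the only knobs. The densities here reflect the GDDR6/6X chips available around Ampere's launch.

```python
# Illustrative sketch: which total VRAM capacities a given bus width allows.
# Assumes one GDDR6/6X chip per 32-bit channel (two in clamshell mode) and
# the 1 GB / 2 GB chip densities available around Ampere's launch.

def possible_vram_configs(bus_width_bits, densities_gb=(1, 2), chips_per_channel=(1, 2)):
    """Return the sorted set of total capacities (GB) a bus width permits."""
    channels = bus_width_bits // 32
    return sorted({channels * d * c for d in densities_gb for c in chips_per_channel})

print(possible_vram_configs(320))  # RTX 3080's 320-bit bus: [10, 20, 40]
print(possible_vram_configs(256))  # 256-bit bus: [8, 16, 32]
```

So a 320-bit card can be 10 or 20 GB but never 16, which is why the odd-looking capacities keep showing up.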
 
Joined
Jan 14, 2019
Messages
266 (0.41/day)
Location
United Kingdom
System Name Rainbow Headcrab
Processor pending
Motherboard pending
Cooling pending
Memory pending
Video Card(s) pending
Storage 512 GB ADATA SU900, 2 TB Seagate Barracuda 2.5"
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Logitech M535
Keyboard MagicForce 68
Software Windows 10
Are people so desperate that they can't wait another month to see what AMD have to offer? Nvidia have a history of mugging off their customers, and day-one purchasers are setting themselves up for buyer's remorse.

Give it a month or two and who knows: Turing cards may start tumbling, or AMD could knock it out of the park. Personally I'm just being sensible and buying a PS5 for the same cost as one of these overpriced GPUs.
That's cool, though buying a console that's only good for playing games, and then buying all my circa 300 games that I own on Steam again, and playing them with a useless controller instead of WASD is totally out of the question for me. Besides, building a new PC is fun, plugging a box into my TV is boring.

As for the desperation part: I agree. Better to wait than to buy the first released, inferior product.
 

ppn

Joined
Aug 18, 2015
Messages
910 (0.48/day)
What are the price/performance expectations for the RTX 3060?

Price: ~$350
Performance: ~RTX 2070
$350 is usually where 60% of the performance of the $700 xx80 cards lands, like the 1060 and 2060 did.

RTX 3060 8G, 4864 CUDA ~~ RTX 2080/S ~~ 60% of RTX 3080.

RTX 3080 10G: 8704 new shaders are on average 31% faster than Turing's 4352, but the memory is only 23% faster. So to reach a 31% average, the shaders must be pulling ahead at ~39% faster, roughly equivalent to 6144 FP32 shaders plus 2560 INT32.
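Taking the poster's figures at face value, the implied shader-side gain is just algebra. This assumes the average speedup is a plain mean of a shader-bound and a memory-bound case, which is a big simplification, but it shows where the ~39% comes from:

```python
# Back-of-envelope check of the estimate above (poster's numbers, not
# measured data): if avg = (shader_gain + mem_gain) / 2, solve for shader_gain.

observed_avg = 1.31  # claimed average 3080-over-2080 Ti speedup
mem_gain = 1.23      # memory bandwidth gain (roughly 760 vs 616 GB/s)

shader_gain = 2 * observed_avg - mem_gain
print(f"implied shader-limited speedup: {shader_gain:.2f}x")  # ~1.39x
```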
 
Joined
Oct 17, 2014
Messages
4,998 (2.27/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5900x
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO
Memory G.Skill 2x16 (32gb) 4000 cas 16-19-19-39 @ 1.42v 1:1
Video Card(s) Big Navi Top Tier
Storage Samsung 2TB SSD
Display(s) ViewSonic VX2768-2KPC-MHD VA 27" 1440p 144hz
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Logitech G502 Hero SE
Keyboard Logitech Cherry Mx Red
Maybe they should launch the original 3080 first? I really don't consider meeting less than 1% of the demand a real launch.
Look at my signature. Soon you will join me in the true gaming realm my padawan.
 
Joined
May 2, 2017
Messages
2,989 (2.34/day)
Processor AMD Ryzen 5 1600X
Motherboard Biostar X370GTN
Cooling Custom CPU+GPU water loop
Memory 16GB G.Skill TridentZ DDR4-3200 C16
Video Card(s) AMD R9 Fury X
Storage 500GB 960 Evo (OS ++), 500GB 850 Evo (Games)
Display(s) Dell U2711
Case NZXT H200i
Power Supply EVGA Supernova G2 750W
Mouse Logitech G602
Keyboard Lenovo Compact Keyboard with Trackpoint
Software Windows 10 Pro
Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
That makes no sense. What defines a 60-series card? Mainly that it's two tiers down from the 80-series. Which die it's based on, how wide a memory bus it has, etc. are all variable and depend on factors that come before product segmentation (die yields, production capacity, etc.). Which die the 3060 is based on doesn't matter whatsoever; its specifications decide performance. Heck, there have been 2060s based on at least three different Turing dice, and they all perform identically in most workloads. The 3060 may well be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other), or at least the PS5 performance level, but that still doesn't mean 8GB isn't plenty for it. People really need to stop this idiocy about VRAM usage being the be-all, end-all of performance longevity; that has only been true for a select few GPUs throughout history.
PCI-e 4.0 x16 does not really seem to be a bottleneck yet and probably won't be a big one for a long while when we look at how the scaling testing has gone with 3.0 and 2.0. Fitting 4 lanes worth of data shouldn't matter all that much. On the other hand, I think this shader-augmented compression is short-lived - if it proves very useful, compression will move into hardware as it has already supposedly done in consoles.

Moving the storage to be attached to GPU does not really make sense for desktop/gaming use case. More bandwidth through compression and some type of QoS scheme to prioritize data as needed should be enough and this is where it really seems to be moving towards.
I agree that it'll likely move into dedicated hardware, but that hardware is likely to live on the GPU, as that is where the data will be needed. Adding this to CPUs makes little sense - people keep CPUs longer than GPUs, GPUs have (much) bigger power budgets, and for the type of compression in question (specifically DirectStorage-supported algorithms) games are likely to be >99% of the use cases.

As for creating a bottleneck, it's still going to be quite a while until GPUs saturate PCIe 4.0 x16 (or even 3.0 x16), but SSDs use a lot of bandwidth and need to communicate with the entire system, not just the GPU. Sure, the GPU will be what needs the biggest chunks of data the quickest, but chaining an SSD off the GPU still makes far less sense than just keeping it directly attached to the PCIe bus like we do today. That way everything gets near optimal access. The only major improvement over this would be the GPU using the SSD as an expanded memory of sorts (like that oddball Radeon Pro did), but that would mean the system isn't able to use it as storage. And I sincerely doubt people would be particularly willing to add the cost of another multi-TB SSD to their PCs without getting any additional storage in return.
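For a sense of scale in this argument, here are approximate peak bandwidth figures. The GDDR6X number is the 3080's 19 Gbps chips on a 320-bit bus; the PCIe numbers are effective per-lane rates after link encoding:

```python
# Rough peak bandwidths (GB/s): an SSD's x4 link is small next to the GPU's
# x16 slot, and both are tiny next to on-card VRAM bandwidth.

PCIE_GBPS_PER_LANE = {3.0: 0.985, 4.0: 1.969}  # effective GB/s per lane

def link_bw(gen, lanes):
    return PCIE_GBPS_PER_LANE[gen] * lanes

print(f"PCIe 4.0 x4  (NVMe SSD): {link_bw(4.0, 4):6.1f} GB/s")
print(f"PCIe 4.0 x16 (GPU slot): {link_bw(4.0, 16):6.1f} GB/s")
print(f"RTX 3080 GDDR6X:          760.0 GB/s")  # 19 Gbps x 320-bit / 8
```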
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,375 (2.99/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 1x 6TB WD Black; 2x 4TB WD Black; 1x400GB VelRptr; 1x 3TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Not sure but I thought the idea was to load textures/animations/model vertex data etc into VRAM where it needs to go anyway.
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.
 
Joined
Jun 13, 2019
Messages
201 (0.40/day)
System Name Fractal
Processor AMD Ryzen 5 3600X
Motherboard Asus ROG STRIX B450-F Gaming
Cooling Noctua NH-D15 Chromax.Black
Memory 2x8 Corsair Vengeance LPX 3600Mhz CL18 (CMK16GX4M2D3600C18)
Video Card(s) Sapphire Pulse Radeon RX 5700XT
Storage ADATA XPG SX8200 PRO 2TB + multiple SATA Rust spinners
Display(s) LG 34GK950F-B
Case Fractal Design R6 Gunmetal Blackout w/ USB-C
Audio Device(s) Steelseries Arctis 7 Wireless
Power Supply Seasonic Prime 850w 80+ Titanium
Mouse Logitech G502 Hero SE
Keyboard Generic Dell
Software Win10 Pro
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.
Every thread is going on about this VRAM "issue". Look at the system requirements for Cyberpunk 2077...I don't think we're hitting a wall here anytime soon.
 

ppn

Joined
Aug 18, 2015
Messages
910 (0.48/day)
The 3060 might obviously be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other),
3060 6GB: 3840 CUDA.
3060 8GB: 4864 CUDA.

There you have it: the 8GB version has ~27% more shaders.
 
Joined
Sep 17, 2014
Messages
13,594 (6.09/day)
Location
The Kitchen Table
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
I thought 10GB was enough guys... :)

Guess Nvidia doesn't agree and brings us the real deal after the initial wave of stupid bought the subpar cards.

Well played, Huang.
 
Joined
Apr 30, 2008
Messages
4,551 (1.00/day)
Location
Multidimensional
System Name PC Mustard Race
Processor AMD Ryzen 5 4600H
Motherboard Asus Laptop OEM AM4
Cooling Internal Laptop Cooling
Memory 16GB Crucial 3200Mhz
Video Card(s) GTX 1650 Ti 4GB + Integrated Vega GPU
Storage 512GB Crucial M.2 / 2TB Seagate 2.5in HDD
Display(s) 144hz Laptop IPS Panel 1080p
Case Laptop Chassis
Audio Device(s) Realtek Laptop Trash
Power Supply Power brick
Mouse CoolerMaster Masterkeys Lite L RGB Mem-Chanical Combo + Touch Pad
Keyboard Laptop RGB KB
Software Windows 10 Home 64bit
Benchmark Scores Don't do em anymore.
If nVidia taught us something with the 20 (Super) series, it's the fact that early buyers get inferior products. That's why I'm going to wait for the 3070 Super/Ti with 16 GB VRAM and a (hopefully) fully unlocked die, unless AMD's RDNA 2 proves to be a huge hit.
This times 100. I'm playing the waiting game, tbh we all are cause no one can get a new card anyways.
 
Joined
Feb 14, 2012
Messages
2,038 (0.64/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Need to see 3070S 16GB vs 2080Ti benchmarks.
 
Joined
Oct 19, 2007
Messages
7,537 (1.58/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RBG fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RBG fans
Memory Corsair Vengeance RBG 2x8GB 3600MHz
Video Card(s) Asus RTX 2080 STRIX OC
Storage Samsung 250GB 960 EVO m.2, 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1515+ RAID5
Display(s) Acer Predator 34" 3440x1440 OC'd to 100MHz
Case Corsair 570x RBG Tempered Glass
Audio Device(s) Onboard / Corsair Void Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G602s
Keyboard Corsair K70 Rapidfire
Software Windows 10 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
3080 20GB is going to be the Ti variant. Calling it now.
 
Joined
Mar 21, 2016
Messages
604 (0.36/day)
Why on earth would a 60-tier GPU in 2020 need 16GB of VRAM? Even 8GB is plenty for the types of games and settings that GPU will be capable of handling for its useful lifetime. Shader and RT performance will become a bottleneck at that tier long before 8GB of VRAM does. This is no 1060 3GB.

RAM chip availability is likely the most important part here. The FE PCB only has that many pads for VRAM, so they'd need double density chips, which likely aren't available yet (at least at any type of scale). Given that GDDR6X is a proprietary Micron standard, there's only one supplier, and it would be very weird if Nvidia didn't first use 100% of available capacity to produce the chips that will go with the vast majority of SKUs.

Does that actually make sense, though? That would mean the GPU sharing the PCIe bus with storage for all CPU/RAM accesses to said storage (and either adding some sort of switch to the card, or adding switching/passthrough capabilities to the GPU die), rather than it being used for only data relevant to the GPU. Isn't a huge part of the point of DirectStorage the ability to transfer compressed data directly to the GPU, reducing bandwidth requirements while also offloading the CPU and also shortening the data path significantly? The savings from having the storage hooked directly to the GPU rather than the PC's PCIe bus seem minuscule in comparison to this - unless you're also positing that this on-board storage will have a much wider interface than PCIe 4.0 x4, which would be a whole other can of worms. I guess it might happen (again) for HPC and the like, for those crunching multi-TB datasets, but other than that this seems nigh on impossible both in terms of cost and board space, and impractical in terms of providing actual performance gains.

Btw, the image also lists two 3080 Super SKUs that the news post doesn't mention.
I didn't say 16GB would be practical, but I could still see it happening. To be fair, if they could piggy-back 12GB onto an RTX 3060 down the road, that would make much more sense in relation to the weaker hardware. You might say it's similar to the RTX 3080 situation going from 10GB to 20GB: if they could piggy-back on just a few of the chips, rather than doubling every GDDR chip, and scale the density that way, that's probably a more ideal scenario for everyone involved except Micron.
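The "piggy-back only a few of the chips" idea amounts to clamshelling a subset of the 32-bit channels. A sketch under that assumption (illustrative numbers, not leaked SKUs; note that an asymmetric layout leaves part of the capacity on a narrower effective bus, as on the GTX 660):

```python
# Illustrative: total VRAM when only some 32-bit channels carry two chips.

def capacity_gb(channels, density_gb, clamshelled_channels=0):
    """Capacity with `clamshelled_channels` carrying two chips, the rest one."""
    return (channels - clamshelled_channels) * density_gb \
        + clamshelled_channels * density_gb * 2

# A 256-bit bus (8 channels) of 1 GB chips:
print(capacity_gb(8, 1))                          # 8 GB baseline
print(capacity_gb(8, 1, clamshelled_channels=4))  # 12 GB, asymmetric
print(capacity_gb(8, 1, clamshelled_channels=8))  # 16 GB full clamshell
```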

Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
Agreed, it's more marketing than practicality. That said, it's a new-generation GPU with new capabilities, so while it might be anemic and stretched thin in resources for that amount of VRAM, perhaps newer hardware is at least more capable of utilizing it by intelligently managing it with other techniques: DLSS, VRS, mesh shading, etc. But we'll see; the RTX 3060 isn't even out yet, so we don't know nearly enough to conclude what it'll be like. On the plus side, GPUs are becoming more flexible at managing resources every generation.
 
Joined
Jan 5, 2006
Messages
10,779 (1.99/day)
System Name Desktop / Laptop
Processor Intel i7 6700K @ 4.5GHz (1.270 V) / Intel i3 7100U
Motherboard Asus Z170 Pro Gaming / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut + 5 case fans / Fan
Memory 16GB DDR4 Corsair Vengeance LPX 3000MHz CL15 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 970 Evo 500GB + Samsung 850 Pro 512GB + Samsung 860 Evo 1TB / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p + 21.5" LG 22MP67VQ IPS 60Hz 1080p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) SupremeFX Onboard / Realtek onboard + B&O speaker system
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 10 / Windows 10
by piggy-backing two of these chips per 32-bit channel (chips on either side of the PCB).
How hot would these chips get with just some thermal pads and a backplate on top of them?
 
Joined
Sep 10, 2020
Messages
13 (0.27/day)
System Name Battlestation
Processor Intel i9-9900K @ 5 GHz
Motherboard Asus Maximus XI Hero
Cooling Corsair H115i Platinum, 2x ML140 PRO
Memory G.Skill Trident Z DDR4-4000 CL17 16 GB
Video Card(s) Asus ROG Strix 2080 Ti O11G
Storage Samsung NVMe SSD 960 PRO 1 TB, NVMe SSD 970 EVO 2 TB
Display(s) Asus ROG Swift PG279Q IPS 1440p 165 Hz G-sync
Case Corsair Crystal 680X
Audio Device(s) Sennheiser GSX 1000 and Game Zero, Onkyo HTX-22HDX
Power Supply Corsair HX850i
Mouse Roccat Cone Pure Ultra
Keyboard Roccat Suora
Software Windows 10 Pro 2010
Answer from u/NV_Tim, Community Manager from NVIDIA GeForce Global Community Team

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
-
Justin Walker, Director of GeForce product management

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples. If you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
-
 

ppn

Joined
Aug 18, 2015
Messages
910 (0.48/day)
If MVDDC power shows 70 watts on the 3080, that's 7 watts per chip: hard to cool. On the 3090, I think those are higher-density chips, not piggy-backed.

According to spec, the 3070S should land around 70% of a 3080, with the 2080 Ti at 76%. Pretty close.
 
Joined
Apr 13, 2009
Messages
210 (0.05/day)
System Name NERV
Processor AMD Ryzen 5 2600X
Motherboard ASROCK B450M Steel Legend
Cooling Artic Freezer 33 One + 2x Akasa 140mm
Memory 2x8 Crucial Ballistix Sport 2993 MHz
Video Card(s) [Waiting for Ampere/RDNA2]
Storage Crucial Mx500 500GB + 1TB HDD
Display(s) Samsung C34H892
Case CM Masterbox Q300L
Audio Device(s) ALC892 + Creative Sound Blaster Z
Power Supply Corsair RM650
Mouse CM Mastermouse Lite S
Keyboard Logitech G510
Software Win 10 Pro x64
Benchmark Scores No bech, only game!
Joined
Jul 7, 2019
Messages
136 (0.28/day)
I feel the Radeon SSG prototype helped inform the direct-access capability that both upcoming consoles use in slightly different ways, as well as a possible future interim upgrade path on theoretical mid-to-high-end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can be used both as extra storage and as an ultra-fast scratch drive, according to Anandtech's article, with the main hurdle being getting software devs to incorporate the necessary API support. It could be a neat feature to install your high-end games onto the GPU drive, or save your project to it, and let the GPU load directly from there, effectively "streaming" the project/game assets in realtime.

I could see a future Radeon X700+ series and NVIDIA X070+ series of GPUs, and their professional equivalents, incorporating an option to install an NVMe PCIe 4.0 (or 5.0, since that's tech supposedly due late next year or 2022 and expected to last for quite a while) drive onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons to their own respective GPUs, acting more like reserve VRAM expansion thanks to higher read/write performance than typical NVMe, but less than GDDR/HBM.
 
Joined
Dec 30, 2010
Messages
1,087 (0.30/day)
The logical next step to DirectStorage and RTX-IO is graphics cards with resident non-volatile storage (i.e. SSDs on the card). I think there have already been some professional cards with onboard storage (though not leveraging tech like DirectStorage). You install optimized games directly onto this resident storage, and the GPU has its own downstream PCIe root complex that deals with it.

So I expect RTX 40-series "Hopper" to have 2 TB ~ 8 TB variants.
Doesn't work like that. It's only meant/designed as a cache, without streaming the data from SSD/HDD/memory over the PCIe bus.

But I'm sure you could utilize all that memory as some kind of storage one day.
 
Joined
Mar 21, 2016
Messages
604 (0.36/day)
How hot would these chips get with some thermal pads and just a backplate on top of it.
Who knows. I'm sure if they got really hot, the backplate would be an obvious indicator. RAM generally doesn't run particularly hot in the first place, though, and it's not like it couldn't be trivially resolved by connecting it to the bottom heatsink with some heat-pipe cooling.
I feel the Radeon SSG prototype helped inform the direct-access capability that both upcoming consoles use in slightly different ways, as well as a possible future interim upgrade path on theoretical mid-to-high-end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can be used both as extra storage and as an ultra-fast scratch drive, according to Anandtech's article, with the main hurdle being getting software devs to incorporate the necessary API support. It could be a neat feature to install your high-end games onto the GPU drive, or save your project to it, and let the GPU load directly from there, effectively "streaming" the project/game assets in realtime.

I could see a future Radeon X700+ series and NVIDIA X070+ series of GPUs, and their professional equivalents, incorporating an option to install an NVMe PCIe 4.0 (or 5.0, since that's tech supposedly due late next year or 2022 and expected to last for quite a while) drive onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons to their own respective GPUs, acting more like reserve VRAM expansion thanks to higher read/write performance than typical NVMe, but less than GDDR/HBM.
I already thought about the Optane thing. AMD could counter that with a DDR or LPDDR DIMM combined with a microSD card, doing RAM-disk backups to the non-volatile storage via an integrated CPU chip, which could also handle compression/decompression to further improve performance. Optane is a fair amount slower than even the DDR option. Optane is cheaper per gigabyte than DDR, but it doesn't take much DDR to speed up a whole lot of memory. The reality is you're mostly confined by the interface, which is why the Gigabyte i-RAM was a bit of a dud on performance relative to what you'd hope for from the memory. The same limitation applies to an HDD's on-board cache: it too is bound by the interface protocol it's attached to. NVMe is vastly better than SATA III in that regard, especially on PCIe 4.0. Unfortunately, HDDs have pretty much ceased to innovate on the performance side since SSDs took over.
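Ballpark peak figures for the interfaces being compared, which is the crux of the "confined by the interface" point (approximate numbers):

```python
# Approximate peak bandwidths (GB/s); the attachment bus, not the medium,
# usually sets the ceiling, which is why an i-RAM on SATA disappointed.

interface_bw_gbs = {
    "SATA III": 0.6,                 # 6 Gb/s link, ~0.55 GB/s after encoding
    "NVMe PCIe 3.0 x4": 3.9,
    "NVMe PCIe 4.0 x4": 7.9,
    "DDR4-3200 dual channel": 51.2,  # 2 channels x 25.6 GB/s
}

for name, bw in sorted(interface_bw_gbs.items(), key=lambda kv: kv[1]):
    print(f"{name:24s} {bw:6.1f} GB/s")
```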
 