
So is the 10 gb vram on rtx 3080 still enough?

Status
Not open for further replies.
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
This is just my observation from watching too many Digital Foundry comparison videos, but there were a few misses, such as better graphics and/or resolution but a worse frame rate than on the original console. Sure enough, that was in quality mode, not performance mode. My guess is that it depends on how well each game is optimized, as well as on the developer's choice to prioritize resolution/graphics over frame rate, or vice versa.
So...snake oil, really. Just another reason to sell something 'better'. Not a need/requirement for improvement.:(
 

Rei

Joined
Aug 1, 2020
Messages
656 (0.48/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
So...snake oil, really. Just another reason to sell something 'better'. Not a need/requirement for improvement.:(
Yup, totally with you.

It just dawned on me why 8K isn't going to be a thing for a while: storage space and pricing. The price of high-capacity storage devices needs to come down hard before people can take in more 8K content, since it takes up several times more space than 4K content.
No easily affordable large storage drives means little 8K adoption, which means no mid-gen refresh.
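As a rough back-of-the-envelope illustration (my own numbers, not from this thread: I'm assuming file size scales linearly with pixel count and a ~50 Mbps 4K bitrate, which real codecs only approximate):

Code:
# Rough illustration of how much more storage 8K video would need than 4K,
# assuming bitrate scales linearly with pixel count (real codecs scale less).
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
print(f"8K has {pixels_8k / pixels_4k:.0f}x the pixels of 4K")

bitrate_4k_mbps = 50          # assumed UHD Blu-ray/streaming-class bitrate
movie_hours = 2
size_4k_gb = bitrate_4k_mbps * 3600 * movie_hours / 8 / 1000
print(f"2h movie: ~{size_4k_gb:.0f} GB at 4K vs ~{size_4k_gb * 4:.0f} GB at naive 8K scaling")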
 
Joined
Feb 20, 2019
Messages
7,342 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Why do you think there will be a refresh of the PS5 & Xbox 4 in less than 2 years? The only reason there was a refresh of the last-gen consoles was to take advantage of the sudden demand in the 4K market segment while maintaining the status quo for 1080p. I highly doubt there will be a 6K market, and I don't see 8K content becoming commonplace in the next 4-5 years, so there won't be a need for a mid-gen refresh. By that time, both (or all 3) companies will already be working on their next-gen consoles.

Because it has happened with previous generations and for the current generation it's shown on the roadmap already. It's not that it might happen, it will definitely happen. Timing is a bit of a guess, but that's really all that's uncertain.

Look at it from Microsoft and Sony's perspective, in a year from now AMD and TSMC will be able to produce the same silicon design on a cheaper process that also requires less power and cooling enabling a cheaper console. That lets them sell the "same" console at a lower entry price, enticing more consumers into their console ecosystem and making them more profit. At the same time, they can offer an upgraded console that delivers 4K60 where the previous console maybe only did 4K30 and hardcore gamers will buy that console as an upgrade, which is yet another sale for the console manufacturer and potentially one more console passed onto a friend without a console - guess what, that's one more customer brought into the console ecosystem.

The sales model for consoles is about selling subscriptions and games. There's really not much profit in the consoles themselves so it makes competitive sense to offer the most appealing console you can at all times.
 

Rei

Joined
Aug 1, 2020
Messages
656 (0.48/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
Look at it from Microsoft and Sony's perspective, in a year from now AMD and TSMC will be able to produce the same silicon design on a cheaper process that also requires less power and cooling enabling a cheaper console. That lets them sell the "same" console at a lower entry price, enticing more consumers into their console ecosystem and making them more profit.
This part isn't called a refresh. It's a revision & has already been done before with PS1-4, Xbox 360, Xbox One S, Wii, etc.
At the same time, they can offer an upgraded console that delivers 4K60 where the previous console maybe only did 4K30 and hardcore gamers will buy that console as an upgrade, which is yet another sale for the console manufacturer and potentially one more console passed onto a friend without a console - guess what, that's one more customer brought into the console ecosystem.

The sales model for consoles is about selling subscriptions and games. There's really not much profit in the consoles themselves so it makes competitive sense to offer the most appealing console you can at all times.
This is unlikely to happen in a controlled ecosystem if the appeal is just a minor boost in performance. For a mid-gen refresh, they would need a larger motivation, such as 8K support without graphical fidelity & performance being worse than the 4K equivalent (like last gen's refresh did for 4K); otherwise the new refresh is likely going to flop. And 8K adoption won't happen for a while without some hurdles to clear, such as the storage capacity vs. pricing issue I mentioned in my previous post.
 
Joined
Feb 20, 2019
Messages
7,342 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This part isn't called a refresh. It's a revision & has already been done before with PS1-4, Xbox 360, Xbox One S, Wii, etc.
Potayto, potahto. Similar specs as the original, but sold cheaper as a slim/lite/XS or whatever.

This is unlikely gonna happen in a controlled ecosystem if the appeal is just a minor boost in performance.
I don't think you understood me; I'm not speculating or guessing, I'm reporting stuff that has already been confirmed. Phil Spencer, head of Xbox, said that both the refresh/revision and the successor to the Series X were already in development back in September, according to the Kotaku interview, and he has answered further probing via Twitter and in a weird (but popular) Nintendo Twitch stream, too.

So, whilst the successor and a refresh/revision are both definitely coming, we don't know exactly when and I suspect Microsoft won't make any announcements until the shine has worn off the fresh new Series X. My guess is that it'll be the 2022 holiday season, but could easily be before then due to TSMC constraints and the Series X silicon being relatively expensive compared to previous consoles. They'll cut costs with revised silicon as soon as they possibly can and take advantage of whatever generational improvements AMD have made to Zen and RDNA since then without breaking compatibility.

Given dev feedback, improved raytracing performance is low-hanging fruit for AMD/Microsoft, as are tweaks to provide a DLSS equivalent (FidelityFX with intelligent VRS, perhaps?) that might make 4K60 more achievable. Almost everyone has at least a 4K60 TV, but neither the PS5 nor the XBSX can realistically run current games at 4K60. They still have to use dynamic resolution scaling, and some games just stick to 30fps instead. CP2077 is perhaps the first major launch for the new consoles, and 4K60 isn't happy on either of them.
 
Last edited:
Joined
Jul 5, 2013
Messages
25,572 (6.46/day)
IMO 10GB is enough for now. It probably won't be enough at some point next year, but people who splurge on flagship vanity cards at 3x the cost of anything in the performance/$ sweet spot aren't likely to give a shit about their card next year, because they'll be buying the new model that comes out then, anyway.
Actually, those are really good points!
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.17/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Thing is, you can always turn a texture setting down from ultra to high, and drop that VRAM usage. 8GB is going to be the 'high end' target for almost all game devs, as they know it's incredibly common.

Hell, the GTX 1060 6GB dominates the Steam charts, so 6GB at medium/high settings seems like a reasonable target for most titles.

Unlike some games (cough CP2077) most games you can turn the textures or AA down and VRAM usage plummets, whereas with GPU grunt you're kinda screwed if you don't have enough
 
Joined
Jul 13, 2016
Messages
2,860 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
That killer app was Control when it comes to visuals.
2080Ti was released in September 2018. 3000-series came September 2020. 2 Years.
Big fat RT tax seems to be questionable at best. RT Cores take up about 3% of the die if not less.

I'd have to disagree

Here is a video demonstrating RTX in Control:


To be honest I'm getting the same quality of reflections with my 1080 Ti in Cyberpunk 2077 with RTX off.

Unless you are specifically cherry picking examples like the RTX trailer does for nearly every game, there are multiple ways to do rasterized reflections that are high quality without ray tracing.

What about the new RTX IO technology? I believe this is going to fix the card's lack of VRAM (if it needs it) by giving it faster access to the M.2 PCIe 4.0 SSD.

No, accessing an M.2 PCIe drive will still be many times slower than VRAM. There's no way to get around the fact that the physical distance from the VRAM to the GPU is a tiny fraction of the distance to the NVMe SSD; on the 30-series the memory is pushed up really close to the GPU. RTX IO is only meant to address the increasing complexity and in-use asset sizes in games, not to diminish VRAM requirements.
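For a rough sense of the gap (my own back-of-the-envelope figures, not from this thread: ~760 GB/s for the 3080's GDDR6X versus roughly 7 GB/s sequential for a fast PCIe 4.0 NVMe drive):

Code:
# Assumed round numbers: RTX 3080 memory bandwidth vs. a fast PCIe 4.0 x4 NVMe SSD.
vram_bandwidth_gbs = 760   # GB/s, approximate GDDR6X bandwidth on the 3080
nvme_bandwidth_gbs = 7     # GB/s, approximate sequential read of a fast PCIe 4.0 drive
buffer_gb = 10             # the 3080's VRAM pool

print(f"Fill 10 GB from VRAM: {buffer_gb / vram_bandwidth_gbs * 1000:.0f} ms")
print(f"Fill 10 GB from NVMe: {buffer_gb / nvme_bandwidth_gbs * 1000:.0f} ms")
print(f"VRAM is ~{vram_bandwidth_gbs / nvme_bandwidth_gbs:.0f}x faster")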

This is kind of a pointless discussion; both sides are arguing over a "what if" scenario in 2-3 years.

In 2-3 years the 3080 won't be enough to run 4K at ultra anyway; that's expected.
16 GB of VRAM won't bottleneck you at 4K, but GPU performance is based on more than just VRAM.
In 2-3 years Nvidia will be on gen 3-4 of its RT implementation and AMD will probably be on gen 2-3. The odds are that GPUs from the early gens will be useless for anything above 1080p medium RT.
If both Sony and MS release mid-gen variants of the consoles, then what's achievable by the XSX/PS5 with RT will become the baseline for an "RT LOW" setting.

Resolution doesn't have as much of an effect on memory consumption as you think. CP2077 shows maybe a 400MB difference between 1440p and 4K.

If you think dropping the resolution from 4K to 2K is going to save these cards memory-wise, think again. 2-3 years is a pretty pitiful life expectancy as well, no? I've had my 1080 Ti for longer than 3 years and have yet to have to drop the texture settings. Then again, it does have more VRAM than cards just released. Imagine that, a card actually designed to last.
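To put some rough numbers on why resolution alone doesn't move VRAM much (my own illustration, assuming 4 bytes per pixel per render target and ignoring compression): the screen-sized buffers are tens of megabytes, while textures, which don't scale with output resolution, are what eat the gigabytes.

Code:
# Approximate size of one 32-bit (4 bytes/pixel) render target, uncompressed.
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {target_mb(w, h):.0f} MB per render target")

# Even a dozen full-resolution G-buffer/post-process targets only add roughly
# 200 MB going from 1440p to 4K -- in the same ballpark as the ~400 MB
# difference quoted for CP2077 above.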

It's a good thing 10GB will be fine for a generation's worth of time (really, one year won't suddenly change anything). Sure, we all want more there, but as good as it is to have a buffer, it's a waste of money when you aren't using close to its capacity. If you play at 4k/ultra perhaps it's good to have towards the end of its life in a few years...

We are already using the entire VRAM amount or more depending on the resolution and game. The only buffer you have left is the game engine and drivers frugally managing your VRAM to avoid massive stuttering.

The 1080 Ti has 11GB and is almost 4 years old. Do you honestly think it's good for games or customers to be buying a 3080 for the same price with only 10GB? Do you honestly believe that doesn't limit game devs? That's right now; forget about the future. For 1 year, it might be fine. Will it last another 4 years like the 1080 Ti has? Most likely not, nor should it; a VRAM decrease to 10 GB over a period of 8 years, from a developer standpoint, has got to be a joke.

There is no defense for the amount of VRAM on Nvidia's new cards, aside from the 3090. It's paltry. People defending this are holding back game devs. A newly released card should not be at its VRAM limit right out of the gate. You can't make games that utilize more VRAM when no cards on the market exist with more VRAM aside from a $1,500 prosumer product. People forget that a VRAM buffer isn't just for the longevity of the card itself; it's also there to push devs to use that VRAM down the line. Software follows hardware, not the other way around.

Not saying we need 16GB (12GB would have been fine), but the current situation is poor. You can argue till the cows come home about whether X or Y amount of VRAM is required for game A or B, but what isn't arguable is that stagnant, or even decreasing, VRAM on video cards has the potential to hold back game developers and, by extension, games. It stands to reason that software follows hardware, therefore hardware with more VRAM has to be released before developers are able to utilize it.
 
Last edited:
Joined
Sep 17, 2014
Messages
20,968 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Thing is, you can always turn a texture setting down from ultra to high, and drop that VRAM usage.

Unlike some games (cough CP2077) most games you can turn the textures or AA down and VRAM usage plummets

You assume you will maintain the same control over your settings in games as you always have, while we keep getting a longer list of titles that really don't offer that control at all. And the new consoles, especially ports that utilize new technologies, are going to simplify those settings further into dynamic and probably hard-to-control setups - OR - you'll find you have to push sliders down pretty damn far to get the desired effect.

But, yes, you can turn settings down... I can also do that on a 2016 GTX 1080 with 8GB :toast:
The assumption you won't have core power on tap for a GPU with much more than 200% the oomph and a mere 120% of VRAM is a very strange one, in a relative sense...
 
Joined
Mar 21, 2020
Messages
77 (0.05/day)
Yes, it is.

If you play 4K games on a 27" monitor, I only know of one game that needs a little bit more, and that's with everything on the highest options.
If you play anything lower, no problem... now and in the future...
 
Joined
Jun 7, 2020
Messages
2 (0.00/day)
Processor Intel Core i7-13700K @ 5.7-6 GHz, 5.1 GHz Ring (16 Cores, HT disabled)
Motherboard ASUS ROG Maximus Z790 HERO
Cooling Corsair iCUE H170i Elite Capellix 420mm AIO
Memory 64 GB G.Skill Trident Z5 RGB DDR5 (F5-6000J3238G32X2) @ 6400 Mbps 30-37-37-28
Video Card(s) EVGA GeForce RTX 3080 FTW3 HYBRID 12 GB G6X @ 2050 MHz Core, 10802 MHz (21.6 Gbps) Memory
Storage x2 Sabrent Rocket NVMe PCIe 4.0 SSDs 3 TB, WD Black 10 TB
Display(s) LG UltraGear 32GP750 1440p 165Hz, Samsung UN225003BF 1080p 60Hz, TCL 50S435 4K 60Hz
Case Corsair 7000D AIRFLOW
Audio Device(s) Creative Sound BlasterX G6 - Beyerdynamic DT 990 Pro LE 250 ohm & Blue Yeti X Mic
Power Supply Corsair RM1000x 80+ Gold
Mouse Logitech G502 HERO
Keyboard Logitech G513 Carbon GX Brown
Software Windows 11 Pro
I don't think the debate over whether 10 GB is enough for today's games is the real problem; the bigger issue is that the x70 SKU has been stuck on 8 GB for almost 5 years now. There's been absolutely NO progression whatsoever for the x70 and below since Pascal... and Pascal's already nearing 5 years old. Yes, NVIDIA finally gave the x80 a bump to 10 GB, but even the GTX TITAN X from Maxwell had more VRAM than that back in early 2015. Of course, that's comparing a TITAN card versus a mere mortal GeForce product, but I'm just trying to show how stagnant NVIDIA's been with increasing VRAM amounts for quite some time now. Pascal doubled Maxwell's VRAM amount across the board, and tripled it from the GTX 960 to the 1060.

It's even funnier that an RTX 3060 12 GB is rumored, which would have more VRAM than a 3080. I don't believe the 3060 will benefit from it, as the GPU itself is too weak to take advantage of it; but this just goes to show that NVIDIA's making a mess of their product stack when they could've just gone 12 GB for the 3080 with a 384-bit bus. Even with 16 Gbps G6 memory on a 384-bit bus, it would still have more bandwidth than 19 Gbps G6X on a 320-bit bus (768 GB/s for G6 at 384-bit vs. 760 GB/s for G6X at 320-bit). Perhaps it'd even be the cheaper route, too. Plus, the 3070 could have had 10 GB on a 320-bit bus with the same 16 Gbps G6 memory, but now with even more bandwidth.
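For reference, the bandwidth math behind those figures is just the per-pin data rate times the bus width in bytes (a quick sketch using the standard GDDR calculation):

Code:
# GDDR bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(19, 320))  # GDDR6X on the 3080's 320-bit bus -> 760.0 GB/s
print(bandwidth_gbs(16, 384))  # hypothetical 16 Gbps GDDR6 on a 384-bit bus -> 768.0 GB/s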
 
Joined
Aug 15, 2016
Messages
486 (0.17/day)
Processor Intel i7 4770k
Motherboard ASUS Sabertooth Z87
Cooling BeQuiet! Shadow Rock 3
Memory Patriot Viper 3 RedD 16 GB @ 1866 MHz
Video Card(s) XFX RX 480 GTR 8GB
Storage 1x SSD Samsung EVO 250 GB 1x HDD Seagate Barracuda 3 TB 1x HDD Seagate Barracuda 4 TB
Display(s) AOC Q27G2U QHD, Dell S2415H FHD
Case Cooler Master HAF XM
Audio Device(s) Magnat LZR 980, Razer BlackShark V2, Altec Lansing 251
Power Supply Corsair AX860
Mouse Razer DeathAdder V2
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro x64
I would rather run out of VRAM and be able to lower some 'not so essential' setting(s) than have 20 GB of VRAM without the processing power needed to use it. Knowing NVIDIA, this won't be a $10 increase anyway.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
We are already using the entire VRAM amount or more depending on the resolution and game
Look at any of the games reviewed at TPU. At 4K using ultra settings there is ONE title that uses more than 10GB. One. The rest are well under that, typically by a few GB. I'm not worried about that snowball getting much bigger (and in this case, that title doesn't have any hitching issues; it's an allocation vs. actual use thing). ;)
 
Last edited:
Joined
Sep 17, 2014
Messages
20,968 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I don't think the debate over whether 10 GB is enough for today's games is the real problem; the bigger issue is that the x70 SKU has been stuck on 8 GB for almost 5 years now. There's been absolutely NO progression whatsoever for the x70 and below since Pascal... and Pascal's already nearing 5 years old. Yes, NVIDIA finally gave the x80 a bump to 10 GB, but even the GTX TITAN X from Maxwell had more VRAM than that back in early 2015. Of course, that's comparing a TITAN card versus a mere mortal GeForce product, but I'm just trying to show how stagnant NVIDIA's been with increasing VRAM amounts for quite some time now. Pascal doubled Maxwell's VRAM amount across the board, and tripled it from the GTX 960 to the 1060.

It's even funnier that an RTX 3060 12 GB is rumored, which would have more VRAM than a 3080. I don't believe the 3060 will benefit from it, as the GPU itself is too weak to take advantage of it; but this just goes to show that NVIDIA's making a mess of their product stack when they could've just gone 12 GB for the 3080 with a 384-bit bus. Even with 16 Gbps G6 memory on a 384-bit bus, it would still have more bandwidth than 19 Gbps G6X on a 320-bit bus (768 GB/s for G6 at 384-bit vs. 760 GB/s for G6X at 320-bit). Perhaps it'd even be the cheaper route, too. Plus, the 3070 could have had 10 GB on a 320-bit bus with the same 16 Gbps G6 memory, but now with even more bandwidth.

If Nvidia had snagged TSMC 7nm for Ampere they very well might have been able to.

I think with their wishlist they just couldn't cram more than this into the already heavily expanded TDP budgets while meeting Navi's performance at a favorable price point. They would have had to make an even bigger die to do so, cutting further into margins and therefore increasing risk. And there is already a gap between Navi's die size and Ampere's. It's the polar opposite of the gap Nvidia had going for it in every gen for the last few years.

The cost of RT ;)
Not on my wallet, that's for sure.
 
Joined
Jul 5, 2013
Messages
25,572 (6.46/day)
I would rather run out of VRAM and be able to lower some 'not so essential' setting(s) than have 20 GB of VRAM without the processing power needed to use it. Knowing NVIDIA, this won't be a $10 increase anyway.
Can't agree with this. It's always better to have more RAM. Always.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Can't agree with this. It's always better to have more RAM. Always.
Unless you're that guy... and this guy... and whoever else feels this way. :p

As you know, the difference between ultra and high is often tough to distinguish without pausing and comparing stills. That, coupled with less need for high AA at 4K UHD, means there are a couple of settings that can be turned down to save some vRAM without a notable loss of IQ. :)

I'm an ultra guy myself (2560x1440/144), but would rather save $100(?) and make a couple of tweaks at the EOL personally... others don't mind paying the premium and not using the extra RAM 90% of the time over the typical life cycle of a GPU (a few years). There is certainly more than one way to skin this cat. ;)
 
Last edited:
Joined
Jul 5, 2013
Messages
25,572 (6.46/day)
I'm an ultra guy myself (2560x1440/144)
I'm not. But experience has taught me time and again that having more RAM is ALWAYS better than not having enough. Fraking ALWAYS!
but would rather save $100(?) and make a couple of tweaks at the EOL personally...
I learned in the '80s that anytime a company released a card with a certain amount of RAM installed, it wouldn't be long before they (or a partner) released a similar card with more (often double) the amount of RAM for a reasonable cost. I generally tend to wait for those cards, with few exceptions. As you might imagine, I'm going to wait until a 3070(ti?) or 3080 variant with 16GB+ VRAM is released and get it. 8GB or 10GB is just not enough for future gaming titles. It's barely enough for the latest new hotness. 2021 is going to see several titles release that will push the limits again.

With the Pascal and Turing gen cards there were no expectations of expanded-VRAM cards because no games were getting close to the VRAM limit at that time. The Radeon 7 with its 16GB of VRAM was viewed as superfluous, and maybe it was. But fast forward to today with this latest gen of cards, games are getting very close to the VRAM limits and AMD, correctly, has planned ahead with 16GB cards. NVIDIA is going to join them shortly, as they can see games pushing up against that VRAM limit and want to offer solutions to keep gamers happy. This is not a new concept. This is history repeating itself yet again.
 
Last edited:
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
But experience has taught me time and again that having more RAM is ALWAYS better than not having enough. Fraking ALWAYS!
The absolute nature of your blanket statement has no legs. It depends on how you use your PC and spend your money (the irony isn't lost on me that you aren't an ultra guy but are still bellowing about 10GB not being enough, lol), but there are other ways to go about it, as I just explained. 'Always' simply isn't true. ;)

2021 is going to see several titles release that will push the limits again.
I'm not holding my breath... consoles rock 13.5GB of a shared pool... RAM and vRAM... so it likely isn't those that are pushing more vRAM use. Sure it will go up with time, but as we've seen in a small cross section of game reviews here, there is literally one title that uses over 8GB of vRAM at 4K (IIRC, there may be two? - maybe that was 10GB, who knows, I'm not looking, lol). But I also addressed that point with allocation and use/what you will experience in the game as well. ;)

But fast forward to today, games are getting very close to the VRAM limits
They aren't though. Please read some of TPU's game reviews and focus on the vRAM use at 4K/Ultra. The ONLY reason 10GB concerns me on this card is if you run 4K, plan on using mods in games, and keeping this thing for more than a few years. Otherwise, you'll be fine for the next couple of years running Ultra or the next few with tweaking a few titles down. Still, the vast majority of titles will fit within the onboard vRAM buffer for years.

EDIT: It's too bad that the Radeon 7 would be a potato trying to run some titles at 4K Ultra/60... it's not an overall 4K 60 card at 8GB, nor at 16GB. Are there any titles that trip over 8GB at 2560x1440? So the R7 is a great example of too much vRAM for the appropriate res and not enough horsepower to use it at 4K. Over half the titles in the TPU review are below 60 FPS at 4K, a few WELL below that value and what many would call 'unplayable'. ;)

EDIT2: For giggles, I looked through the last 10 of TPU's game performance reviews... here are the accurate 4K values (max settings, some include RT)...

Cyberp 2077 - 7.1/9.9 (w/o RT, w/ RT)
Godfall - 8.6GB
AC: Valhalla - 6.1GB
Watchdogs - 7.4GB
Star Wars - 4.6 GB
Horizon Zero Dawn - 8.6GB
Death Stranding - 4.9GB
Gears Tactics - 6.2GB
Res Evil 3 - 7.2/7.7 (DX11/DX12)
Doom Et - 8.3GB

I would say one title is very close (over 9GB), approaching the limit. Most of the others are not remotely close, using 7.7GB or less. Maybe there are other titles that show more, but I'm just giving 10 examples of titles that don't hit 10GB... which happen to be the latest 10 titles he's tested. You can continue on down the list if you want. :)
 
Last edited:
Joined
Feb 20, 2018
Messages
41 (0.02/day)
10GB is likely enough for 99% of AAA titles out currently, but for emulation or games that offer enhanced texture packs it's a major limitation, since you often don't need a lot of GPU core power but you do need a ton of VRAM. Capacity aside, I'd like AMD and NVIDIA to stop using 1GB memory chips so GPUs get smaller for once, but for that we need memory companies to stop just boosting clock speeds at lower voltages and to cut latency to under 16 nanoseconds.
 
Joined
Nov 11, 2016
Messages
3,077 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
Lol, don't be like HUB, who love to make future predictions when perf/dollar is not in AMD's favor.
If RTX/DLSS is only available in 1% of games, there are far fewer games than that which require more than 10GB of VRAM, and therefore it is statistically irrelevant (as you can avoid those games altogether).
Remember Godfall's 12GB VRAM bullcrap? Yeah, AMD marketing was trying pretty hard there. HUB was shilling pretty hard, then tried to come out as victims when Nvidia's PR team wanted to cut ties with them.
 
Last edited:
Joined
Sep 17, 2014
Messages
20,968 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
The absolute nature of your blanket statement has no legs. It depends on how you use your PC and spend your money (the irony isn't lost on me that you aren't an ultra guy but are still bellowing about 10GB not being enough, lol), but there are other ways to go about it, as I just explained. 'Always' simply isn't true. ;)

I'm not holding my breath... consoles rock 13.5GB of a shared pool... RAM and vRAM... so it likely isn't those that are pushing more vRAM use. Sure it will go up with time, but as we've seen in a small cross section of game reviews here, there is literally one title that uses over 8GB of vRAM at 4K (IIRC, there may be two? - maybe that was 10GB, who knows, I'm not looking, lol). But I also addressed that point with allocation and use/what you will experience in the game as well. ;)

They aren't though. Please read some of TPU's game reviews and focus on the vRAM use at 4K/Ultra. The ONLY reason 10GB concerns me on this card is if you run 4K, plan on using mods in games, and keeping this thing for more than a few years. Otherwise, you'll be fine for the next couple of years running Ultra or the next few with tweaking a few titles down. Still, the vast majority of titles will fit within the onboard vRAM buffer for years.

EDIT: It's too bad that the Radeon 7 would be a potato trying to run some titles at 4K Ultra/60... it's not an overall 4K 60 card at 8GB, nor at 16GB. Are there any titles that trip over 8GB at 2560x1440? So the R7 is a great example of too much vRAM for the appropriate res and not enough horsepower to use it at 4K. Over half the titles in the TPU review are below 60 FPS at 4K, a few WELL below that value and what many would call 'unplayable'. ;)

EDIT2: For giggles, I looked through the last 10 of TPU's game performance reviews... here are the accurate 4K values (max settings, some include RT)...

Cyberp 2077 - 7.1/9.9 (w/o RT, w/ RT)
Godfall - 8.6GB
AC: Valhalla - 6.1GB
Watchdogs - 7.4GB
Star Wars - 4.6 GB
Horizon Zero Dawn - 8.6GB
Death Stranding - 4.9GB
Gears Tactics - 6.2GB
Res Evil 3 - 7.2/7.7 (DX11/DX12)
Doom Et - 8.3GB

I would say one title is very close (over 9GB), approaching the limit. Most of the others are not remotely close, using 7.7GB or less. Maybe there are other titles that show more, but I'm just giving 10 examples of titles that don't hit 10GB... which happen to be the latest 10 titles he's tested. You can continue on down the list if you want. :)

Shared pool or not, what do you expect? Game logic on its own isn't that RAM intensive. If it's 2GB, you're being generous. A full-blown Windows 10 install is a mere 4 GB right now, and can be much lower too.

And, again... storage. Consoles will be using a special cache system.

And as for the examples being benched in TPU reviews right now - if you have a couple, that's already a sizeable percentage of the games being benched, is it not? It's only fair to translate that to the entire catalogue of games coming out. I don't recall ever playing only what W1zzard wants to bench :)

Might be wrong, but I sense some shifting goalposts here... slowly but surely. And we're still only waiting for the cards to be readily available :D
 
Last edited:
Joined
Feb 20, 2019
Messages
7,342 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
AMD has the right idea with the 6000 series Radeons having 16GB from the get-go.
My feeling, knowing a few game devs in person, is that they work with the common hardware constraints. The tools to make ultra-high quality art assets are readily available - developers are basically making textures and meshes automatically from photography, 3D scans, and pointclouds now. If 16GB VRAM becomes common then they'll design levels that utilise 16GB of art. It's really not any extra work for them, they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes, 2GB, 4GB, or 8GB for example.

So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards. It's what you see on that splash screen in many games with all the technology logos. Devs aren't manually putting in extra effort to hand-make their own optimisation and modelling tools, they're already out there in mainstream use. Heck, our in-house AEC visualisation studio uses them on a daily basis, I spend several hours a month troubleshooting asset conversion, interoperability and other niggles for those applications.

We've had 8GB cards for over 6 years at the high end (290X) and almost 5 years at the midrange (RX470). That's right, 8GB has been at the mainstream, mass-market price point for HALF A DECADE.
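To put rough numbers on that compression slider (my own illustration, assuming BC7-class block compression at about 1 byte per texel plus roughly a third extra for the mip chain):

Code:
# Approximate in-memory size of one block-compressed texture (~1 byte/texel),
# including ~1/3 overhead for a full mip chain.
def texture_mb(resolution, bytes_per_texel=1.0, mip_overhead=4/3):
    return resolution * resolution * bytes_per_texel * mip_overhead / 1024**2

for res in (2048, 4096, 8192):
    print(f"{res}x{res}: ~{texture_mb(res):.0f} MB")

# ~5 MB, ~21 MB, ~85 MB respectively: every notch up on that slider roughly
# quadruples the footprint of each texture in the scene, which is why the same
# assets can be shipped to fit 4GB, 8GB, or 16GB cards.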
 
Joined
Sep 17, 2014
Messages
20,968 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
My feeling, knowing a few game devs in person, is that they work with the common hardware constraints. The tools to make ultra-high quality art assets are readily available - developers are basically making textures and meshes from photography now. If 16GB VRAM becomes common then they'll design levels that utilise 16GB of art. It's really not any extra work for them, they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes, 2GB, 4GB, or 8GB for example.

So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards. It's what you see on that splash screen in many games with all the technology logos. Devs aren't manually putting in extra effort to hand-make their own optimisation and modelling tools, they're already out there in mainstream use. Heck, our in-house AEC visualisation studio uses them on a daily basis, I spend several hours a month troubleshooting asset conversion, interoperability and other niggles for those applications.

We've had 8GB cards for over 6 years at the high end (290X) and almost 5 years at the midrange (RX470). That's right, 8GB has been at the mainstream, mass-market price point for HALF A DECADE.

The tools are so commonly available that most modders are keen to use them as well and present us with better-quality packs for games ;)
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Shared pool or not, what do you expect? Game logic on its own isn't that RAM intensive. If it's 2GB, you're being generous. A full-blown Windows 10 install is a mere 4 GB right now, and can be much lower too.

And, again... storage. Consoles will be using a special cache system.

And as for the examples being benched in TPU reviews right now - if you have a couple, that's already a sizeable percentage of the games being benched, is it not? It's only fair to translate that to the entire catalogue of games coming out. I don't recall ever playing only what W1zzard wants to bench :)

Might be wrong, but I sense some shifting goalposts here... slowly but surely. And we're still only waiting for the cards to be readily available :D
Slowly but surely is right. I'm not saying it won't increase, I'm saying there is plenty of room (for most titles) to have an increase while still slotting in under 10GB at 4K/Ultra (and again, you can lower some settings to high and not use copious amounts of AA). Some of these could grow by over 100% before they get to 10GB.
TPU reviews right now - if you have a couple,
I looked, there are NONE that go over 10GB... one is close at 9.9 with RT enabled, however. Please see the edit where I listed each of the last 10. :)
 