• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Are game requirements and VRAM usage a joke today?

Joined
Oct 14, 2007
Messages
633 (0.10/day)
Location
Shelby Township, MI
System Name MSI GT77HX
Processor Intel i9-13980HX
Memory 64 GB DDR5 @ 4800 mhz
Video Card(s) NVIDIA RTX 4090 Mobile
Storage 2 TB 980 Pro
Display(s) 4K/144 Hz Mini-LED
Benchmark Scores 23,616 Timespy Graphics
I believe it has to do with better textures; RT makes things worse (another reason why it's useless).
RT is most certainly not useless. Plenty of games look dramatically better with it on.

You can't trust software VRAM readings at all, not even in-game numbers (some games show you an xx/xx GB figure, for example).
+ Many game engines simply allocate a fixed share of VRAM by default. Tons of games allocate more than they need, like 85 or 90% of all available VRAM, yet actually use half of that.
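That allocation-versus-usage gap can be sketched with a toy model. Everything here is hypothetical (class name, reserve fraction, sizes); it only illustrates why an overlay's "allocated" number can sit far above what the game actually needs:

```python
# Toy model of a game engine's VRAM pool: the engine reserves a large
# slab up front ("allocated", what overlays typically report) while only
# a fraction of it holds live resources ("used").
class VramPool:
    def __init__(self, total_vram_gb, reserve_fraction=0.9):
        # Many engines grab a fixed share of VRAM at startup.
        self.allocated_gb = total_vram_gb * reserve_fraction
        self.used_gb = 0.0

    def upload(self, resource_gb):
        # Uploading assets consumes the reservation; it does not change
        # the "allocated" number an overlay would show.
        if self.used_gb + resource_gb > self.allocated_gb:
            raise MemoryError("pool exhausted")
        self.used_gb += resource_gb

pool = VramPool(total_vram_gb=16)  # a 16 GB card
pool.upload(6.5)                   # textures, buffers, etc.
print(f"allocated: {pool.allocated_gb:.1f} GB, used: {pool.used_gb:.1f} GB")
# "allocated" stays at 14.4 GB even though only 6.5 GB is live
```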

Also, Nvidia has several features that reduce VRAM requirements (the large L2 cache in the Ada architecture improves hit rates, for example, plus better memory compression; you can read more in the architecture deep dives).

Pretty much no game uses more than 12GB at 4K/UHD on ultra settings. With heavy RT you might go above that, though, yet no 12 or 16 GB card will manage heavy RT at 4K/UHD anyway. Especially not AMD cards, whose RT performance is very weak. Pointless.


Most of the games you're talking about are AMD-sponsored titles and rushed console ports -> https://www.amd.com/en/gaming/featured-games.html

Properly optimized games use little VRAM. Atomic Heart, completely maxed out at 4K/UHD, uses around 7GB and looks better than 99% of games.

Generally, almost no game needs more than 8-10GB at 1440p; most hover around 4-6GB. 12GB is more than enough, and 16GB or more is wasted.

Look here: the 3070 8GB easily outperforms the 6700 XT 12GB in new games at 4K/UHD in terms of minimum fps. Minimum fps would drop very low if VRAM were an issue -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html

All this VRAM talk started because of AMD marketing and a few rushed console ports (AMD-sponsored as well). Very few properly optimized games use a lot of VRAM. Like I said -> https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html


AMD has been caught doing this before, back with the Shadow of Mordor uncompressed texture pack, which did nothing for the end user -> https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/

They only did it to hurt performance on many Nvidia cards (though it also affected many AMD users).

The difference between high and ultra textures is often just slightly less compression (sometimes ultra is uncompressed), which you mostly won't notice while actually playing the game. Dropping texture quality to low, and sometimes medium, is easily visible, but high and ultra look mostly identical, especially without 300% zoom and in motion.
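The "slightly less compression" point is easy to quantify. The bytes-per-texel figures below are the standard ones for uncompressed RGBA8 and the BC1/BC7 block-compressed formats, and the 4/3 mip-chain factor is the usual approximation; the helper function itself is just an illustrative sketch:

```python
# Approximate VRAM footprint of a square texture, with and without
# block compression. RGBA8 stores 4 bytes/texel, BC7 1 byte/texel,
# BC1 0.5 bytes/texel; a full mip chain adds ~1/3 on top.
BYTES_PER_TEXEL = {"rgba8": 4.0, "bc7": 1.0, "bc1": 0.5}

def texture_mib(size, fmt, mips=True):
    bytes_ = size * size * BYTES_PER_TEXEL[fmt]
    if mips:
        bytes_ *= 4 / 3  # geometric series: 1 + 1/4 + 1/16 + ...
    return bytes_ / (1024 * 1024)

# A 4K texture: "ultra" (uncompressed) vs "high" (BC7-compressed)
print(round(texture_mib(4096, "rgba8"), 1))  # 85.3 (MiB)
print(round(texture_mib(4096, "bc7"), 1))    # 21.3 (MiB)
```

A 4x size difference per texture for an image difference you need a zoomed screenshot to spot, which is the argument made above.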


I have a 4090 and 24GB is absolutely wasted on gaming. Beyond allocation alone, it's simply not needed.

By the time 24GB is actually needed in demanding games maxed out, the 4090 will belong in the trash bin; the GPU will be too slow to run games maxed out anyway.


Some people think a lot of VRAM will matter eventually; they just don't account for the GPU itself, which will be too weak to max out games by then, meaning less VRAM is required.


You have to be logical. Game developers know that the majority of PC gamers don't have more than 8GB.

The PS5 and Xbox Series X have 16GB of shared RAM for the entire system, meaning OS and background tasks, game logic, and graphics, with a 4K/UHD resolution target (dynamic resolution, though).
You can definitely play games at 4K with RT on 16 GB VRAM cards; I do so fairly often. I've also already seen a game that requires over 16 GB at max settings at 4K: Hogwarts Legacy. It doesn't just allocate all 16 GB of VRAM I have, it starts using my system RAM as VRAM on top of that, about 19 GB in total.
 
D

Deleted member 185088

Guest
RT is most certainly not useless. Plenty of games look dramatically better with it on.
Which games? Control has pathetic textures and geometry, and the same goes for CP2077, which still uses geometry comparable to the PS1 era. I do admit that UE5 is impressive, because it covers everything: textures, geometry, LOD and lighting.

Another issue is the performance hit. If it were 5 to 10% it would be acceptable, but halving the performance is stupid. The technology hasn't caught up yet.
 
You've got to be trolling - or blind. Literally not worth debating with someone who thinks Cyberpunk 2077 looks like a PS1 game. The game is gorgeous, so is Control.
 
Joined
Jul 13, 2016
Messages
2,845 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
so what's the point of DLSS then?

While DLSS doesn't help with VRAM issues, it does still provide a better experience in non-VRAM-limited scenarios (assuming you need the extra frames). As Hardware Unboxed put it, many of Nvidia's features like DLSS and RTX work best on their high-end cards. VRAM and memory bandwidth are just being used by Nvidia to control what a card can do and for how long. Most people are going to buy Nvidia's 8GB cards, given those cover everything up to $499 (the 4060 Ti 16GB sits at $500). In that price range you don't get competent ray-tracing performance (on par with the 2080) in current games, let alone future ones. In addition, you aren't given enough VRAM to max out all current games, and certainly not future games 3-5 years from now. The memory bandwidth is abysmal to boot on all cards at or below that price.

Ultimately the point of DLSS, like RTX and the older GameWorks technologies before it, has been to increase sales. I remember when Nvidia bought Ageia for PhysX: they stripped out the CPU code path that was working great on a variety of systems and replaced it with a CUDA code path that only worked well on CUDA-based Nvidia GPUs. That nuked performance on AMD cards and on the Nvidia cards without CUDA, which had conveniently just been replaced by a new generation. Funny how that works.

Game sponsorships are never good for gamers, but it's become industry standard for AMD and Nvidia to sponsor titles because both companies realize it essentially enables a vertical monopoly. People buy the hardware (GPUs) to run software (games), so by controlling the performance and features of the software you gain a massive amount of influence over GPU sales. DLSS and RTX rely heavily on this strategy instead of waiting for organic adoption. The advantage for Nvidia was that AMD was essentially bankrupt until after Polaris, so Nvidia was free to sponsor titles and exert immense control over the features and performance of the most popular games releasing. It's the opposite of what you'd call a free market.

It's frankly astounding to me that gamers don't see the danger in further supporting proprietary technology integration in games by the graphics card makers themselves (DLSS, FG, Reflex, etc.). People want more GPU manufacturers, but they don't seem to realize that all of those vendor-specific features like Reflex simply won't work on any other brand's card. The market desperately needs more options that are hardware-agnostic, but even AMD is taking the Nvidia route.

At the end of the day, the GPU market is getting into an increasingly difficult pickle: we are getting more and more vendor-specific features, and gamers increasingly act like it's a two-party American political system, when in the end they're only hurting themselves by purchasing video cards for features only that brand will ever be able to use.
 
DLSS does help with VRAM issues. I had to use DLSS in Hogwarts Legacy to get a playable experience, and it did reduce the game's VRAM usage. Even in games where VRAM isn't the issue, my VRAM usage absolutely goes down when I turn on DLSS.

Example below:

This is Cyberpunk 2077 at 4K RT Overdrive - no DLSS:

15.4/16 GB VRAM allocation


12.5/16 GB VRAM allocation with DLSS Quality enabled


11.9 GB/16 GB VRAM allocation: DLSS Balanced


10.9 GB/16 GB VRAM allocation: Ultra Performance

No matter how powerful my GPU might be, it would be dead in the water at 4K if it had 12 GB of VRAM instead of 16 GB. Even with 16GB, there's a chance that at 4K it'd exceed my 16 GB and start using system RAM.

But with DLSS on Balanced, Performance, or Ultra Performance, allocation drops below 12 GB. DLSS can absolutely help GPUs with less VRAM play at higher "output" resolutions.
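The allocation drop tracks the internal render resolution. The per-axis scale factors below are the commonly cited ones for each DLSS preset (treat them as approximate; the helper is an illustrative sketch, since render targets shrink by the square of the factor while asset memory does not):

```python
# Per-axis render scale for each DLSS preset: the GPU renders at this
# internal resolution, then upscales to the output resolution.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def internal_res(out_w, out_h, preset):
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

for preset in DLSS_SCALE:
    w, h = internal_res(3840, 2160, preset)
    saved = 1 - (w * h) / (3840 * 2160)
    print(f"{preset:18} {w}x{h}  (~{saved:.0%} fewer pixels rendered)")
# quality renders 2560x1440, performance 1920x1080, ultra_performance 1280x720
```

Fewer rendered pixels means smaller G-buffers and intermediate targets, which is why allocation drops a couple of GB per preset step in the screenshots above.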

If the RTSS overlay seems misleading, labeling the card a 4090 while showing 16 GB of VRAM, it's because this is a laptop: the 4090 Mobile is actually a desktop 4080 with slower GDDR6 (instead of GDDR6X) and a 175 W power limit in place.
 
Joined
Sep 17, 2014
Messages
20,953 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
RT is most certainly not useless. Plenty of games look dramatically better with it on

It isn't always, but I'm doing my photo-mode sightseeing tour of Night City (still ongoing), and every time I compare RT on (highest quality setting) to Cyberpunk's raster implementation with everything maxed out, more often than not I struggle to see a difference, and if it's there, it's never really an improvement. It's different, and only if you look for it. When you turn on Psycho local/screen-space reflections, you already have 95% of the RT experience right there, even without turning RT on, but at double the FPS.

Without path tracing, honestly, I don't see the point of RT at all. Path tracing turns regular RT's "somewhat different implementation of raster lighting" into a completely different, much more refined lighting affair that can completely change a scene and generally improves the whole presentation, even though in many cases it makes the overall picture darker. In other situations it almost matches the non-RT picture. The overall picture though... yeah, I'd pick path traced over raster every time.

This one (path-traced version below) was the most impressive difference in that sense; compare it to RT off and it's a different thing altogether. I'll add another screenshot later tonight under this one when I redo Delamain's final mission.

Cyberpunk2077_2023_10_13_23_03_19_974.jpg


Here's the raster version:

Cyberpunk2077_2023_10_14_20_19_45_858.jpg


This one though? It's nearly identical to the raster image, even path traced. The most marked improvement is how the mirrors reflect light: here they form that perfect circle, in raster they don't.

Cyberpunk2077_2023_10_13_21_16_01_829.jpg
 
RT doesn't make a giant difference in every scene, but I definitely prefer to play games with it on. Would I recommend it to someone using a 2060 or 6600 XT? No, but for those with GPUs capable of using it without major compromises, I absolutely consider it worth using. The fps impact is much larger percentage-wise on lower-end GPUs, especially at low resolutions. There are games where my fps barely changes when I turn RT on at a low resolution, because at that resolution I was CPU-bottlenecked with it off.
 
Joined
Mar 7, 2023
Messages
538 (1.28/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast 5600 cl36 Oc'd to 6000 cl32
Video Card(s) Asus Tuf 4090 24GB (non-oc version, undervolted)
Storage 2TB Netac nv7000 + 2TB P5 Plus + 2TB SN850X + 2*(4TB MX500) in raid 0. 14TB Total.
Display(s) Dell 23.5" 1440P IPS panel (P2416D)
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte 850w (ud850gm pg5)
Mouse Some piece of shit from China
Keyboard Some piece of shit from China
Software Yes Please?
DLSS does help with VRAM issues..... I had to use DLSS in Hogwart's to get a playable experience, and it did result in its VRAM usage decreasing. [...]
Well, you should be looking at usage, not allocation, because if there's less VRAM than the game would like to allocate, it will just allocate less. You also give people like las ammunition to say "everybody has enough VRAM, games just allocate too much", or whatever it was. But yes, DLSS does indeed reduce VRAM use. I remember that was the only way to get The Last of Us to work on my 3070: I had to use Balanced at 1440p on high to get under 8GB, and with no DLSS it would crash. That's when I realized 8GB was not long for this world and upgraded.
 
DLSS does help with VRAM issues..... I had to use DLSS in Hogwart's to get a playable experience, and it did result in its VRAM usage decreasing. [...]

DLSS helped your frame rate in the above examples, but it didn't solve an issue your card doesn't have. As your first screenshot shows, the game takes 15GB at ultra + path tracing + 4K, whereas your card has 16GB.

DLSS lowers the VRAM used because it lowers the internal render resolution, but in practice none of the 4000-series cards actually benefit from that reduction. The 4090 and 4080 both have enough VRAM for it not to be an issue. Meanwhile the 4070 gets a mere 2 FPS in the same scenario you presented above (4K + path tracing + ultra), so there is no scenario where that's playable on anything below a 4080 (and I'd question whether it's even an enjoyable experience on the $1,200 4080), regardless of whether DLSS is used. If we drop the resolution to 1440p ultra with no RT, we get 58 FPS on the 4070 with VRAM usage at 9.8GB. Once again, the game's VRAM usage is below the card's VRAM allowance, so there's no memory-size issue here for DLSS to fix.

Now move even further down the stack to the 4060, the card most likely to benefit from said VRAM reduction given its 8GB, and the problem becomes that the card is not capable of reasonably running the game in any memory-size-bound scenario. For example, 1080p ultra path tracing uses 10GB of VRAM and the 4060 can only get 32 FPS. Not only would enabling DLSS cost a large amount of visual quality due to the 1080p base render resolution, it also wouldn't solve the lack of VRAM, because memory savings at lower resolutions are smaller: you only drop from 10GB to 9.7GB by lowering the resolution from 1080p to 720p, so DLSS doesn't solve the issue. On top of that, Nvidia's SKUs below the 4080 all lack memory bandwidth. The 4080, 4070 Ti, 4070, 4060 Ti, and 4060 all have lower bandwidth than their predecessors (even the 4090 is barely an increase over the 3090). The problem for the cards below the 4080 is twofold: lack of bandwidth and lack of VRAM.

In addition, consider that not every game's VRAM requirements scale with resolution as much as CP2077's do. Most other recent games show only a 500 MB - 1GB difference between 1080p and 4K. The bulk of VRAM usage in most modern games comes not from resolution but from an increasingly large asset base the game needs resident in memory.
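That split, a mostly resolution-independent asset base plus resolution-dependent render targets, can be sketched as a toy estimator. The constants here (9 GB asset base, 130 bytes of render targets per output pixel) are purely illustrative, not measured figures:

```python
# Toy VRAM estimate: a fixed asset base (textures, geometry, BVH) plus
# render targets that scale with pixel count. Illustrative constants only.
def vram_estimate_gb(width, height, asset_base_gb=9.0,
                     target_bytes_per_pixel=130):
    target_gb = width * height * target_bytes_per_pixel / 1e9
    return asset_base_gb + target_gb

gb_1080p = vram_estimate_gb(1920, 1080)  # ~9.27 GB
gb_4k = vram_estimate_gb(3840, 2160)     # ~10.08 GB
print(round(gb_4k - gb_1080p, 2))        # 0.81: assets dominate the total
```

With numbers in this ballpark, quadrupling the pixel count from 1080p to 4K only moves total usage by well under 1 GB, matching the 500 MB - 1GB deltas described above, and upscaling can only ever claw back the resolution-dependent part.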

I understand what you're saying: DLSS does indeed lower VRAM usage, but in practice there's no realistic scenario where it actually enables a card to overcome its lack of VRAM. This is entirely by design.
 
Joined
Oct 15, 2011
Messages
1,985 (0.43/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
2016 4-8gb was common but a lot of cheap cards have 6gb (1060) and 1070 8gb was the best buy i think
2018 average was 8gb
3 GB was also common with the GTX 1060 in 2016-2017, possibly 2018 as well.
 
DLSS helped your frame-rate in the above examples but it did not solve an issue your card doesn't have. [...]
I'm sure it varies game to game how much DLSS actually helps with VRAM limitations. In this scenario, I can play at basically any DLSS preset (I'd probably pick Performance, which isn't pictured here but lands between Balanced's 49 fps and Ultra Performance's 94), but I can't play without DLSS. Without DLSS it isn't a VRAM issue, though; it's just a "game's too damn demanding" issue.

If I had a 4070 Ti, though, and wanted to stay comfortably under 12 GB, DLSS Performance and Ultra Performance would do that. DLSS Balanced showed 11.9 GB for me, technically under 12 GB, but I could easily see that exceeding 12 GB from time to time, so I wouldn't recommend it even on a 4070 Ti. One preset below that should comfortably stay under 12 GB, and when a 4070 Ti isn't running out of VRAM it performs very similarly to my 4090 Mobile.

I'm pretty sure there are also a lot of games where DLSS makes 1440p more sensible on GPUs like the 3070/3070 Ti/4060/4060 Ti, where they might go over 8 GB at native 1440p but don't when upscaling from 1080p.

EDIT: Tried testing at 1440p to see how that pans out.

40 fps at native 1440p with over 12 GB of VRAM allocated; 70 fps with about 11 GB allocated at DLSS Quality. Definitely seems like a scenario where DLSS Quality enables pretty solid gameplay on a 4070 Ti by pushing the allocation under 12 GB. Especially since with Frame Gen on, it's around 115 fps at 1440p DLSS Quality. I'd expect very similar performance from a 4070 Ti, while without DLSS I'd expect it to get crippled.

8 GB GPUs, though, cannot be saved by DLSS with path tracing. Even at 1080p with DLSS Ultra Performance I'm seeing 10 GB allocated.
 
Joined
May 3, 2019
Messages
1,466 (0.80/day)
System Name BigRed
Processor I7 12700k
Motherboard Asus Rog Strix z690-A WiFi D4
Cooling Noctua D15s/MX6
Memory TEAM GROUP 32GB DDR4 4000C16 B die
Video Card(s) MSI RTX 3080 Gaming Trio X 10GB
Storage M.2 drives-Crucial P5 500GB 4x4/WD SN850X 4TB 4x4/WD SN850X 2TB 4x4
Display(s) Dell s3422dwg 34" 3440x1440p 144hz ultrawide
Case Corsair 7000D
Audio Device(s) Topping D10s DAC/PCamp TC 1680 AMP/MS M10 Speakers/Bowers and Wilkins P7 Headphones
Power Supply Corsair RM850x 80% gold
Mouse Logitech G604 wireless
Keyboard Logitech G413 carbon
Software Windows 10 Pro
Benchmark Scores Who cares
I think to a certain degree they play to "bigger is better", knowing full well there are PC owners who will just buy the card with the most RAM and the highest specs whether they really need it or not; hence the 24GB 4090, which would probably have been just as good with 16.
 
Eh, games already push 16 GB at 4K, frequently. If the 4090 had 16 GB, I'd genuinely consider that unacceptable for a card that expensive. It might not need 24 GB today, but the extra makes it a lot more future-proof.
 
EDIT: Tried testing at 1440P to see how that pans out. [...]

Allocation is not utilization. CP2077 with path tracing enabled on ultra uses 11.5GB:

1697331230177.png


VRAM usage in the scenario presented is just below the card's maximum, so DLSS is not providing any VRAM-size help; the game already fits without DLSS enabled. Mind you, memory allocation exceeding VRAM size isn't always going to cause immediate issues. It depends on how much of that allocated memory is hot, as in frequently used by the game engine, and how good the engine is at streaming assets between system memory and VRAM (which can only mitigate the problem to a certain extent). You might be able to allocate 0.5 to 3.0 GB over the VRAM size and, depending on the game, it might be fine.
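The hot-versus-cold point can be sketched with a minimal LRU residency model. All names and sizes here are hypothetical; the idea is that total allocations may exceed the VRAM budget without stutter, as long as the set of assets touched every frame still fits, because cold assets can be evicted (streamed back to system RAM) on demand:

```python
from collections import OrderedDict

# Minimal LRU residency model: allocations in total may exceed VRAM,
# but only the working set must stay resident; the coldest entries
# are evicted when the budget would be exceeded.
class ResidencySet:
    def __init__(self, vram_budget_gb):
        self.budget = vram_budget_gb
        self.resident = OrderedDict()  # asset -> size_gb, in LRU order
        self.evictions = 0

    def touch(self, asset, size_gb):
        if asset in self.resident:
            self.resident.move_to_end(asset)  # mark as recently used
            return
        while sum(self.resident.values()) + size_gb > self.budget:
            self.resident.popitem(last=False)  # evict the coldest asset
            self.evictions += 1
        self.resident[asset] = size_gb

vram = ResidencySet(vram_budget_gb=12)
# 16 GB of distinct assets in total, but each frame only touches a
# 10 GB hot set plus one occasional one-off asset:
for frame in range(100):
    for i in range(10):
        vram.touch(f"hot{i}", 1.0)       # 10 x 1 GB hot assets, every frame
    vram.touch(f"cold{frame % 6}", 1.0)  # 6 x 1 GB cold assets, rarely reused
print(all(f"hot{i}" in vram.resident for i in range(10)))
# True: only the cold one-offs ever get evicted; the hot set never leaves VRAM
```

Once the per-frame hot set itself outgrows the budget, though, the same model thrashes every frame, which is the point where over-allocation stops being harmless.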

Eh, games already push 16, frequently, at 4K. If the 4090 had 16 GB I'd genuinely consider that unacceptable for a card that expensive. It might not need 24 GB today, but it makes it a lot more future proofed.

Yes, 16GB would be unacceptable for the 4090, given that class of card is expected to last a few generations and many prosumers with memory-heavy workloads buy it. There are a couple of games on the market that can tap up to 18GB; 24GB gives it buffer for future titles.
 
Allocation is not utilization. CP2077 with Path tracing enabled on ultra uses 11.5GB: [...]
It's not that simple. You can't measure it in one specific spot and declare that's what the game uses at those settings; it varies.
 
Joined
Aug 25, 2023
Messages
126 (0.51/day)
System Name Favourite toy(s)
Processor Ryzen 5 7600X lapped & Ryzen 7 7700
Motherboard Asrock X670E Steel Legend / Gigabyte B650 Aorus Elite
Cooling Deep Cool AK620 / Stock cooler
Memory G.Skill F5-5600J3036D16GX2-FX5 / Corsair Vengeance CMH32GX5M2B5600C36
Video Card(s) RX 6800 XT factory overclocked / iGPU
Storage 1 + 2TB T-Force Cardea A440 pro / 2 x Kingston KC3000 1TB
Display(s) Samsung G5 Odyssey / Philips 278E9Q
Case MSI MPG Sekira 100R / Silverstone Redline mATX
Audio Device(s) Asus Xonar AE 7.1 + AT -AD500X / Onboard + Creative 2.1 soundbar
Power Supply Corsair RM1000x V2 / Corsair RM750x V2
Mouse MSI Clutch GM20 Elite / CM Reaper
Keyboard Logitech G512 Carbon / MS Internet keyboard
If you're into modding games, visual or graphical mods will of course add to the VRAM burden as well, so I don't think that should be forgotten in this discussion.
 
It's not that simple. You can't measure it in one specific spot and declare that's what the game uses at those settings. It varies.

Those numbers come from TPU's review of Phantom Liberty; they are not just a figure from a single spot. You seem to be implying that your methodology is better than TPU's, when literally all of your figures are in fact from a specific spot. You almost certainly didn't isolate variables or set up a proper benchmark environment either. If you're going to call out professional reviews, you had best make sure your own values don't have the very problem you're criticizing.
 
Joined
Oct 14, 2007
Messages
633 (0.10/day)
Location
Shelby Township, MI
System Name MSI GT77HX
Processor Intel i9-13980HX
Memory 64 GB DDR5 @ 4800 mhz
Video Card(s) NVIDIA RTX 4090 Mobile
Storage 2 TB 980 Pro
Display(s) 4K/144 Hz Mini-LED
Benchmark Scores 23,616 Timespy Graphics
Those numbers were provided by TPU's review of Phantom Liberty. They are not just a figure from a single spot. You seem to be implying that your methodology is better than TPU's, when literally all of your figures are in fact from a specific spot. You almost certainly didn't isolate variables or set up a proper benchmark environment either. If you are going to call out professional reviews, you had best make sure your own values don't have the very problem you are criticizing.
I didn't say my numbers were better. I'm saying any specific value is not necessarily the highest amount the game will ever use. If a game usually uses 11 GB but sometimes spikes to 12.5, that's still an issue for the GPU. What matters is the peak usage, not the average or the minimum.
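To illustrate the point about peak vs. average usage, here's a toy sketch. The readings are made-up per-second samples (in GB) of the kind a monitoring tool might log — they are not measurements from any real game:

```python
# Toy illustration: why peak VRAM matters more than the average.
# samples_gb are hypothetical per-second readings from a monitoring
# tool, not real measurements.
samples_gb = [10.8, 11.0, 11.1, 10.9, 12.5, 11.0, 10.7]

average = sum(samples_gb) / len(samples_gb)
peak = max(samples_gb)

print(f"average: {average:.2f} GB")   # prints "average: 11.14 GB" -- looks fine for a 12 GB card
print(f"peak:    {peak:.2f} GB")      # prints "peak:    12.50 GB" -- briefly over budget
print("fits in 12 GB?", peak <= 12.0) # prints "fits in 12 GB? False"
```

A card judged only on the average would look safe here, yet the single spike past the 12 GB budget is exactly where stutter or texture swapping shows up.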
 
Deleted member 185088 (Guest)
You've got to be trolling - or blind. Literally not worth debating with someone who thinks Cyberpunk 2077 looks like a PS1 game. The game is gorgeous, so is Control.
I'm sorry if you love the game, but geometry and textures are far more important for realism than lighting, and UE5 achieves both. If a game still uses geometry like this, RT won't help; and this is from the path-traced version of the game:
(attached screenshot from the path-traced version of the game)

Control has the same issue, with worse textures. With what UE5 achieves, my expectations are higher.
 
Joined
Jun 19, 2023
Messages
59 (0.19/day)
System Name EnvyPC
Processor Ryzen 7 7800 X3D
Motherboard Asus ROG Strix B650E-I Gaming WiFi
Cooling Thermalright Peerless Assassin 120 White / be quiet! Silent Wings Pro 4 120mm (x4)
Memory 32GB Team Group T-Force Delta RGB DDR5-6000 CL30
Video Card(s) MSI GeForce RTX 4080 16GB Gaming X Trio White Edition
Storage SK Hynix Platinum 2TB NVME
Display(s) HP Omen 4K 144Hz
Case InWin A3 White
Power Supply Corsair SF750 Platinum
Software Windows 11 Pro
While DLSS doesn't help VRAM issues

in practice it doesn't provide any practical scenarios where it's actually going to enable a card to overcome its lack of VRAM.

Wat?! Says who? :wtf:

DLSS/FSR/XeSS and the like absolutely help VRAM-strapped cards. As someone who was on "Team 8GB" up until last week, upscaling is what let me play games like Far Cry 6 at 4K ultra + RT without everything turning into a slideshow. Rendering at a lower internal resolution dropped my VRAM usage by ~2GB.
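A rough back-of-the-envelope shows why a lower internal resolution saves memory. This assumes a simplified deferred pipeline with six full-resolution RGBA16F render targets — real engines use more buffers and varied formats, so this understates the total, and the upscaler itself adds some overhead:

```python
# Back-of-the-envelope: resolution-dependent render-target memory.
# Assumes a simplified G-buffer of 6 full-res targets at 8 bytes/pixel
# (RGBA16F); real engines differ widely, so treat this as a sketch.

def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    """Approximate render-target memory in MiB."""
    return width * height * bytes_per_pixel * targets / (1024 ** 2)

native_4k = render_target_mb(3840, 2160)     # native 4K
dlss_q = render_target_mb(2560, 1440)        # 4K "Quality" internal res (1440p)

print(f"native 4K:    {native_4k:.0f} MB")   # prints "native 4K:    380 MB"
print(f"1440p internal: {dlss_q:.0f} MB")    # prints "1440p internal: 169 MB"
```

That's over 200 MB saved on the G-buffer alone; add every other resolution-dependent buffer (shadow/SSR/post-processing, history buffers, etc.) and savings in the low gigabytes are plausible.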
 
Joined
Sep 1, 2020
Messages
2,033 (1.52/day)
Location
Bulgaria
I'm waiting for the time when games ship with all ray-tracing effects included and enabled. My theory is that good FPS will then need 2-3 times the RT performance of an RTX 4090, plus more VRAM. I think no less than 16GB for 4K and no less than 32GB for 8K.
 
Joined
Sep 21, 2020
Messages
1,498 (1.14/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
It's indisputable that game requirements will only go up over time. They always have. Since their inception, video games have driven the development of new technologies and advancements in hardware. With the transition of the medium to 3D in the mid-90s, gamers have come to expect more realistic, more detailed graphics, richer environments, and spectacular visual effects -- all in increasingly higher resolutions.

Major studios have always developed their games on cutting-edge hardware. A dev's workstation is usually far more powerful than what Joe the gamer has at home. Unfortunately, as of late, we have seen an increasing number of titles which run poorly not just on an average PC, but even on a high-end config. It seems that optimizing their latest release so that it runs acceptably on weaker components isn't a priority for many software houses today. This issue is worth another discussion, but it's an industry-wide problem which lies with the developers, studio/project managers, publishers, video card manufacturers, and the gamers themselves.

And let's not forget that games are developed for the current generation of consoles first and foremost. Both XSX and PS5 sport eight-core CPUs accompanied by 16 GB of fast unified memory, and most of these resources are available to the developer. Major games are created to run comfortably on console-equivalent hardware, not on Johnny's "gaming" laptop with a mobile GTX1660 or RTX3050. Future consoles will have even more memory and more powerful GPUs -- and we're already halfway through the current gen.

As for VRAM requirements, the need for a larger frame buffer is undeniable with modern titles. While many games can still be played on an 8 GB card in 1080p, maxing out the details, especially in higher resolutions, will often call for 12 GB or more. I regularly see 14-16 GB maximum consumption in 4K with titles that are a couple of years old at this point. Shadow of the Tomb Raider, a 2018 game, will show more than 15 GB of dedicated VRAM in use, as reported per process. Even good old GTA5 -- an eight-year-old game -- will peak at over 9 GB with everything cranked up.
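High-resolution textures are one of the big drivers behind those numbers. As a rough sketch (assuming BC7 block compression at 1 byte per pixel, and a full mip chain adding about a third on top — real asset budgets vary a lot):

```python
# Rough estimate of GPU memory for a block-compressed texture.
# BC7 stores 1 byte per pixel; a full mip chain adds ~1/3 on top
# (geometric series 1 + 1/4 + 1/16 + ... = 4/3).

def texture_mb(width, height, bytes_per_pixel=1.0, mips=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

print(f"4K BC7 texture: {texture_mb(4096, 4096):.1f} MB")  # prints "4K BC7 texture: 21.3 MB"
print(f"8K BC7 texture: {texture_mb(8192, 8192):.1f} MB")  # prints "8K BC7 texture: 85.3 MB"
```

A scene streaming a few hundred 4K textures is already several gigabytes before render targets, geometry, or RT acceleration structures enter the picture.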

That said, gaming on the PC gives us the great opportunity to experiment, tweak the settings, and find out their impact on performance. IMO we should always try to strike the kind of balance between visual fidelity and fps that we find comfortable. For example, gamers were outraged to learn Immortals of Aveum's requirements when the game launched. Running on UE5.1, this is one of the most demanding titles released this year:

(attached image: Immortals of Aveum official system requirements)


The game has received a number of patches post-launch, and future updates have been announced. I played this on a secondary PC with a Ryzen 3300X, a CPU well below the minimum 8c/16t AMD requirement. My GPU -- a 6600XT -- while matching the minimum spec of a 5700XT, was supposedly only good for 720p60 (1080p upscaled on quality preset), with everything on low.

Despite the excessive official requirements, everything played very smoothly and felt very responsive on this budget config. I ran Immortals in native 1920x1200 with all settings on high, which is the middle preset. Only shadows, cinematics DoF and cinematics motion blur were set to low. The fps hovered around 60 most of the time, with dips into the 40s. The only occasional drops below 40 fps were in some cutscenes and heavy battles with multiple opponents.

In all, I believe that the people who keep complaining about growing hardware requirements are either:
- those who have unrealistic expectations of their PCs (in no small part thanks to marketing ploys used by hardware manufacturers and game publishers)
- the ones who are too lazy to get familiar with the game settings to try and match them to their PC's specs
- and the ones who absolutely must play every major release at launch

And honestly, if your system can't handle that latest hyped AAA title, maybe try playing an indie or an older game instead? There are dozens of great games that will run perfectly on a 10 year old PC :)
 
Joined
Mar 16, 2017
Messages
1,676 (0.64/day)
Location
Tanagra
System Name Budget Box
Processor Xeon E5-2667v2
Motherboard ASUS P9X79 Pro
Cooling Some cheap tower cooler, I dunno
Memory 32GB 1866-DDR3 ECC
Video Card(s) XFX RX 5600XT
Storage WD NVME 1GB
Display(s) ASUS Pro Art 27"
Case Antec P7 Neo
Still rocking Office XP Pro at home for exactly that reason. There is no snappier Word... Even when files get large (>300 pages full of tables and whatnot), XP remains snappy. 365 and Word Online will choke completely on this; and even the offline version is ridiculously slow, you can just feel it choke on large text files. If it's really bad, you'll even get a constant intermittent loading cursor, about once every second, continuously, and you can't even work properly. We're even moving to splitting the files in half now at work, it's that bad.
For as much as I like Excel, I despise Word. It almost feels as if it was designed to work against you. I do have copies of XP and 2000 in storage, but I don't even use Office at home. I'm stuck with the ever-updating MS365 at work. Something changes every month, often seemingly just for the sake of change.
 
Joined
Dec 28, 2012
Messages
3,481 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
So why can't we just use the Reader from 2020 until there's a major quality update? Why update even when there's absolutely no need to?
Security. PDFs are a GREAT way to infect PCs, and people always underestimate how complicated PDFs are. They can effectively be made into tiny OSes if you're willing to build one.

For as much as I like Excel, I despise Word. It almost feels as if it was designed to work against you. I do have a copies of XP and 2000 in storage, but I don't even use Office at home. I'm stuck with the ever-updating MS365 at work. Something changes every month, often seemingly just for the sake of change.
At work we use Google Suite, since MS has this hard-on for removing macros, which is like the only thing Excel has going for it anymore. Office peaked in 2007, IMO.
 