
Are game requirements and VRAM usage a joke today?

I'll give you two PT screenshots in Cyberpunk to illustrate my earlier example
Sure, I mean, in that example I get your point; it's transformed, but more subtly. It's, however, very easy to find many other examples that are far more changed, far less subtle. Also Portal and Quake 2: insanely transformative differences in the fully path-traced versions. Reflections alone are a massive uptick for me personally, as I notice SSR fall apart very easily.

We can cherry-pick examples all day long to support either side of the coin. What I'm more getting at is that I've seen PT and even RT transform a game a hell of a lot more than texture packs have; the benefit of texture packs is that they're near enough to free performance-wise if you have the VRAM for them. It's almost a moot comparison, apples and oranges.
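To put some rough numbers on the "free if you have the VRAM" point, here is a sketch of the arithmetic with illustrative texture sizes I picked myself (BC7 block compression works out to about 1 byte per texel, and a full mip chain adds roughly a third on top), showing why a high-res pack is mostly a memory cost rather than a frame-time cost:

// Rough arithmetic sketch with illustrative numbers: a BC7-compressed texture
// costs ~1 byte per texel (16 bytes per 4x4 block), and a full mip chain adds
// roughly one third on top, so a high-res pack mostly costs VRAM, not FPS.
#include <cstdio>

int main()
{
    const double bytesPerTexel = 1.0;       // BC7: 16 bytes per 4x4 block
    const double mipOverhead   = 4.0 / 3.0; // full mip chain ~= +33%
    const int sizes[] = {2048, 4096, 8192};

    for (int size : sizes)
    {
        double mib = size * double(size) * bytesPerTexel * mipOverhead
                     / (1024.0 * 1024.0);
        std::printf("%dx%d BC7 texture with mips: ~%.0f MiB of VRAM\n",
                    size, size, mib);
    }
    return 0;
}

So a couple of hundred 4K textures resident at once is on the order of a few gigabytes, which is where the VRAM pressure from these packs comes from.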
 
Yeah, they don't compare in that sense. A texture pack is a straight upgrade (IF the pack actually increases fidelity), but it's the same image. RT/PT can change a game's look, (color) tone and atmosphere, even if it's the same image, and even with varying texture detail levels.

They both cost VRAM, I guess :D

I'm not trying to cherry-pick examples, by the way, more just trying to illustrate what I mean. There's no right or wrong to me in this; there's only a sliding scale of balancing perceived IQ against the performance you want, right...
 
More VRAM means... more VRAM.
I can use it to run multiple things that consume VRAM; sometimes I like to run 2-3 games at the same time to do different things in them, and they all consume VRAM.
I have 64 GB of RAM and I use RAM disks, and everything works flawlessly because everything is in RAM/VRAM and nothing is in the page file.
So, I use that and it works for me.
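As a side note on how that multi-game juggling looks from an application's point of view, here is a minimal sketch, assuming Windows and DXGI 1.4 (link against dxgi.lib), of querying the VRAM budget the OS currently grants a process and how much it is already using; when several games share one GPU, each of them sees a smaller budget here.

// Minimal sketch (assuming Windows + DXGI 1.4, link with dxgi.lib): query how
// much VRAM the OS currently budgets for this process and how much it already
// uses. With several games sharing one GPU, each process sees a smaller budget.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(factory->EnumAdapters1(0, &adapter)) || FAILED(adapter.As(&adapter3)))
        return 1;

    // Segment group LOCAL = dedicated VRAM (NON_LOCAL would be shared system RAM).
    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM budget: %llu MiB, current usage: %llu MiB\n",
                info.Budget >> 20, info.CurrentUsage >> 20);
    return 0;
}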

You prefer RT instead of VRAM - fine, but can you see the difference between raster shadows and RT shadows while running fast and shooting at NPCs/players? No?
Yes, I know you can see the difference when sitting in one place and taking screenshots and/or comparing two screenshots.

When I played Control, I tried using RT effects a few times (beginning/middle/end of the game) - but turning every surface into a mirror is a terrible idea, sunlight turning into fog, etc. I tried playing with full RT or partial RT effects a few times, but... sorry, that's really wrong. So I support AMD by using lightweight RT - only where needed! That can make the graphics better without FPS dropping by 50-80%.

So if you're ok with this Nvidia gimmick - heavy RT everywhere - well, I'm not.
 
More VRAM means... more VRAM.
I can use it to run multiple things that consume VRAM,

I agree, but the Radeon's VRAM is useless in 95% of the apps out there.
At the same time, CUDA + VRAM is a killer combination, and that's why we have to pay more for green cards with just a tad more VRAM than usual.
 
More VRAM means... more VRAM.
So if you're ok with this Nvidia gimmick - heavy RT everywhere - well, I'm not.
Behold, in the midst of the mud a jewel emerges, an amalgam of divine wisdom and difficult-to-swallow truth.
 
I agree, but the Radeon's VRAM is useless in 95% of the apps out there.
At the same time, CUDA + VRAM is a killer combination, and that's why we have to pay more for green cards with just a tad more VRAM than usual.
If you need more VRAM, like me - Radeon.
If you need CUDA - Nvidia.
I agree with you.
 
You prefer RT instead of VRAM - fine, but can you see the difference between raster shadows and RT shadows while running fast and shooting at NPCs/players? No?
Yes, I know you can see the difference when sitting in one place and taking screenshots and/or comparing two screenshots.
I must be blind. Every time someone posts screenshots of RT to prove how "amazing" it is, I can't see it. They're the same picture.

So I'm fine leaving RT off.
 
I leave it on because it looks good :D
 
You prefer RT instead of VRAM - fine, but can you see the difference between raster shadows and RT shadows while running fast and shooting at NPCs/players? No?
Yes, I know you can see the difference when sitting in one place and taking screenshots and/or comparing two screenshots.
Beautifully cherry-picked, I must say. Of all the countless scenarios where one would notice the difference, you pick that example/effect to make your... uhhh... point?
I support AMD by using lightweight RT
These are literally the most meh in terms of RT effects/experiences, while still costing a significant amount of performance, and you applaud it? Like, I can fully appreciate that Nvidia sometimes take it too far because they've leaned into it (and you get a far more obvious difference), but AMD RT titles... I mean, their RT implementations aren't known for striking visuals.
I must be blind. Every time someone posts screenshots of RT to prove how "amazing" it is, I can't see it. They're the same picture.
You might do well to try playing RT games yourself. Your hardware is capable of it, so why not try? I get that not running it very fast also hurts the prospect, but if you literally can't see any difference, you might be blind. Being amazed is up to you, I guess, but they're most certainly different images being rendered.
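On the "your hardware is capable of it" point, a minimal sketch (assuming Windows, the D3D12 headers and the default adapter) of how an application can ask the driver whether the GPU exposes hardware ray tracing at all, and at which DXR tier:

// Minimal sketch (assuming Windows + D3D12, default adapter, link with d3d12.lib):
// ask the driver whether hardware ray tracing (DXR) is exposed and at what tier.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;  // no D3D12-capable adapter at all

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5{};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("Hardware DXR supported, tier %d\n", int(opts5.RaytracingTier));
    }
    else
    {
        std::printf("No hardware DXR on this GPU/driver\n");
    }
    return 0;
}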
 
Beautifully cherry-picked, I must say. Of all the countless scenarios where one would notice the difference, you pick that example/effect to make your... uhhh... point?

These are literally the most meh in terms of RT effects/experiences, while still costing a significant amount of performance, and you applaud it? Like, I can fully appreciate that Nvidia sometimes take it too far because they've leaned into it (and you get a far more obvious difference), but AMD RT titles... I mean, their RT implementations aren't known for striking visuals.
My point applies to 90% of the gameplay in every game.
Sure, the player can stand and admire the landscape from time to time, but most of the time he's playing...

AMD offers weak RT - that's right - and because of that, they use light RT effects. And here's the trick: this doesn't mean "go and buy a Radeon" - no, you can buy what you want. But I'd prefer game developers not to kill FPS because of a few mirrors and shadows.
 
Beautifully cherry-picked, I must say. Of all the countless scenarios where one would notice the difference, you pick that example/effect to make your... uhhh... point?

These are literally the most meh in terms of RT effects/experiences, while still costing a significant amount of performance, and you applaud it? Like, I can fully appreciate that Nvidia sometimes take it too far because they've leaned into it (and you get a far more obvious difference), but AMD RT titles... I mean, their RT implementations aren't known for striking visuals.

You might do well to try playing RT games yourself. Your hardware is capable of it, so why not try? I get that not running it very fast also hurts the prospect, but if you literally can't see any difference, you might be blind. Being amazed is up to you, I guess, but they're most certainly different images being rendered.
I guess both companies try to encourage the kind of RT that works best with their hardware: AMD with light effects like shadows, while Nvidia brute-forces every surface to be a mirror. Neither approach is good, in my opinion. One is hardly distinguishable from traditional rendering and is thus pointless, while the other, though it has the wow factor, doesn't look real at all.

In my opinion, better hardware is needed on both the red and green sides, so that devs can play around with RT to make games actually look better, and not just turn them into mirror simulators for people with high-end cards.
 
My point applies to 90% of the gameplay in every game.
Interesting, because I get immersion the entire time playing a game with RT global illumination, reflections that don't break when panning, shadows that don't pop in and out of LOD distances, perfect ambient occlusion grounding objects in the game world, and so on. So I'd say that I'm noticing and appreciating good RT the entire time I play when the effects amount to something transformative.
Nvidia brute forcing every surface to be a mirror....... and not just turn them into mirror simulators
While I have seen some demos and teaser footage that emphasise this, and some games really lean into reflections by making everything reflective (CP 2077, lol), I'd disagree that [all] Nvidia-sponsored titles are simply reduced to 'mirror simulators', but certainly the ones with multiple good, heavy, transformative effects (or PT) are more so for people with high-end cards. I have a feeling the next generation of consoles and beyond is where RT for the masses will truly be in full swing.
 
Interesting, because I get immersion the entire time playing a game with RT global illumination, reflections that don't break when panning, shadows that don't pop in and out of LOD distances, perfect ambient occlusion grounding objects in the game world, and so on. So I'd say that I'm noticing and appreciating good RT the entire time I play when the effects amount to something transformative.
I agree that some other effects are different and visible :)
 
Interesting. I saw a 2023 round-up video by DF; the first few games they showed had PS3-quality textures at launch on 8-10 GB VRAM GPUs, but some patches later, performance was up, stuttering was down and, most importantly, textures were half respectable. Most likely some non-sensitive content was shifted to RAM, and the 1.0 version was based on the console unified-memory model.

Playing FF16 now on PS5 in graphics mode and it's a beauty. Not perfect textures (characters blur up close, which could be DoF or poor textures, but environment textures are good quality), but very good, no stutters, immersive 30 fps (there is a 60 fps mode for those obsessed with latency) and how I think an RPG should be; in addition, no pop-ins or LOD issues. But I can't help but feel that when this comes to PC, my 10 GB 3080 won't provide as good an experience.
 
I have never seen a PS5, but I have a hard time believing it would be better than 137K and 3080 :twitch:
 
I have never seen a PS5, but I have a hard time believing it would be better than 137K and 3080 :twitch:
I held one in my hands for a brief moment once, between winning it in a raffle and selling it straight after. :laugh:

Seriously, though, looking at its specs and price, I suspect roughly 8700K and 5700 non-XT level performance from it.
 
It's known that Nvidia's GPUs use their memory way more efficiently than the Radeons.

I'd still get a 2080 Ti instead of the 3070, though. It's like playing with fire with 8 GB at 1440p.
 
It seems like this is the wrong time to buy a €1500-2000 gaming laptop?

Desktops are catching up in the amount of VRAM while laptops are lagging behind (not for the first time). I think it's embarrassing that we can't buy a new €2000 laptop SKU with a bit more VRAM, even if it's probably not that easy to add physically after the board design is set, or they have to follow the chip makers' specifications. I'm not saying who's doing it wrong here.
The fact that laptop GPUs don't have the same dies as desktop GPUs with the same model number might be part of the problem.

AMD may have been more generous with having a bit more VRAM on desktop, but they're no better than Nvidia on laptop, although that seems to be due to missing RDNA 3 laptop GPUs rather than the amount of VRAM being set too low.

A 7800 variant still hasn't been launched, let alone been seen or leaked AFAIK, and the 7900M is in just one model (also not <€2000). AMD has nothing between 32 and 72 RDNA3 CUs for laptops; that's a huge gap.

Here are 676 models with 8 GB VRAM that have shown up during the last year, starting at €950 for an Alder Lake/4060.

Looking at >8 GB VRAM, there are 165 models, with the least expensive at €2200.

This is assuming you want that VRAM, and assuming you're looking at this generation of GPUs. You can obviously find better deals on older models, like 3080 Tis with price cuts, but we can't expect the same from those when it comes to price and amount of VRAM.

Oh, and don't give me the "there will always be better models in the future" line; that's not what I'm talking about. ;)
 
I held one in my hands for a brief moment once, between winning it in a raffle and selling it straight after. :laugh:

Seriously, though, looking at its specs and price, I suspect roughly 8700K and 5700 non-XT level performance from it.

Both consoles are 2070 Super/2080 level, or 6700 10G on the Radeon side, just with more VRAM... Oddly, even with the extra VRAM, texture quality is still trash in a lot of console games compared to the highest settings on PC.

My PS5 is almost always worse than my 6700 XT/7600 in all aspects, in games that are on both, for example.
 
I know people want a lot of VRAM - "ooo, xxx isn't enough". The thing I see is that the more there is, the more companies think "well, we can be lazy with optimizations since we've got a ton of memory". Console games made do with 16 GB of shared memory. Most PCs had near that in just dedicated GPU memory alone.
 
Console games made do with 16 GB of shared memory. Most PCs had near that in just dedicated GPU memory alone.
Steam says 5 % have more than 12 GB, 25 % have more than 8 GB. Dunno where you got your "most" from.

56 % have more than 6 GB, according to Steam.
 
Been playing FF15 today with the 4K textures - absolutely glorious. It needs 9 GB of VRAM, but at least the textures are there to justify it. For me, textures make and break visuals; they're above shadows, lighting and rendering resolution in visual impact.

Both consoles are 2070 Super/2080 level, or 6700 10G on the Radeon side, just with more VRAM... Oddly, even with the extra VRAM, texture quality is still trash in a lot of console games compared to the highest settings on PC.

My PS5 is almost always worse than my 6700 XT/7600 in all aspects, in games that are on both, for example.
For me, I find that with older games, PC rips consoles apart. SGSSAA and high-res texture packs are two of the main reasons.

However, new AAA stuff? PCs are so bad in UE games: texture/shader streaming, unified-memory code ported over lazily, causing PS3-quality textures unless you're on a 16+ GB card.

The problem with PC settings nowadays, especially for textures, is that they're usually variable now; they're conditional on VRAM being available, as well as on the CPU and I/O being able to load the data in, and if not, the LOD won't max out. According to DF, some games are finally starting to precompile shaders on first run, which is a step in the right direction, but there are still more issues to solve.
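For what it's worth, "precompile shaders on first run" usually comes down to building pipelines up front and persisting the result. Here is a minimal sketch of one common mechanism, a Vulkan pipeline cache written to disk (the cachePath and the two function names are mine, purely illustrative):

// Minimal sketch (Vulkan; cachePath and function names are illustrative):
// persist the pipeline cache so shaders compiled on the first run can be
// reused on later runs instead of being recompiled mid-game.
#include <vulkan/vulkan.h>
#include <fstream>
#include <iterator>
#include <vector>

VkPipelineCache loadPipelineCache(VkDevice device, const char* cachePath)
{
    // Read whatever was saved on a previous run (empty on a first run).
    std::vector<char> blob;
    std::ifstream in(cachePath, std::ios::binary);
    if (in)
        blob.assign(std::istreambuf_iterator<char>(in), {});

    VkPipelineCacheCreateInfo info{VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO};
    info.initialDataSize = blob.size();
    info.pInitialData    = blob.empty() ? nullptr : blob.data();

    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, nullptr, &cache);
    return cache;  // pass this to vkCreateGraphicsPipelines / vkCreateComputePipelines
}

void savePipelineCache(VkDevice device, VkPipelineCache cache, const char* cachePath)
{
    // Serialize everything compiled this session back to disk.
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, nullptr);
    std::vector<char> blob(size);
    vkGetPipelineCacheData(device, cache, &size, blob.data());

    std::ofstream(cachePath, std::ios::binary)
        .write(blob.data(), static_cast<std::streamsize>(size));
}

D3D12 has an equivalent idea in ID3D12PipelineLibrary, and some games simply enumerate and create all their PSOs behind the "compiling shaders" screen on first launch.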
 
For me, I find that with older games, PC rips consoles apart. SGSSAA and high-res texture packs are two of the main reasons.

However, new AAA stuff? PCs are so bad in UE games: texture/shader streaming, unified-memory code ported over lazily, causing PS3-quality textures unless you're on a 16+ GB card.

I wasn't trying to compare PC vs console - both have a place - I was more just talking about the general rasterization performance of both.

The reason consoles do better on UE4/5 as far as shader compilation and traversal stutter go is that they have dedicated silicon for decompression, unlike PC, where we have to brute-force it a bit, at least until DirectStorage or something similar becomes more prevalent.
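For reference, the PC-side answer being alluded to here is DirectStorage with GPU decompression. A minimal sketch, assuming the DirectStorage 1.1+ SDK and an already-created D3D12 device, destination buffer and fence (the pack file name, offsets and sizes are made up):

// Minimal sketch (assuming the DirectStorage 1.1+ SDK plus an existing D3D12
// device/buffer/fence; file name, offsets and sizes are made up): read a
// GDeflate-compressed blob from disk and let the runtime decompress it on the GPU.
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void streamCompressedBlob(ID3D12Device* device, ID3D12Resource* destBuffer,
                          ID3D12Fence* fence, UINT64 fenceValue)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"textures.pak", IID_PPV_ARGS(&file));  // made-up pack file

    DSTORAGE_REQUEST req{};
    req.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    req.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    req.Source.File.Source        = file.Get();
    req.Source.File.Offset        = 0;
    req.Source.File.Size          = 8 * 1024 * 1024;   // compressed size on disk (example)
    req.UncompressedSize          = 16 * 1024 * 1024;  // size after GPU decompression (example)
    req.Destination.Buffer.Resource = destBuffer;
    req.Destination.Buffer.Offset   = 0;
    req.Destination.Buffer.Size     = req.UncompressedSize;

    queue->EnqueueRequest(&req);
    queue->EnqueueSignal(fence, fenceValue);  // fence signals when the data is resident
    queue->Submit();
}

The point of the design is that the compressed bytes go from the NVMe drive to the GPU and get decompressed there, instead of being inflated on CPU cores the way most titles have traditionally done it.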
 