Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

Not true. Check this out

That was another video that made no sense at all. You shouldn't test that on cards with 24GB; it's like upgrading from 16GB of RAM to 32GB and suddenly all your applications report they're using more. And I'm not sure Afterburner or any similar software has ever reported real VRAM usage.
I don't get what is so difficult about testing things in realistic scenarios: not ultra settings, not 4090s at 1080p. It's infuriating.

I'm sorry, but they are sellers, all of them. They want to sell you their videos and they make a living from our clicks; testing realistic scenarios is not click material.

Next we'll be testing Cyberpunk at 4K on a 3dfx, using no keyboard and with one hand tied behind my back, while drinking cola and eating pizza driving down the highway, because lols.
 
Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? : )
 
Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? : )
No. With the extreme temperatures around the GPU, and especially the frequent swings over a large range from the operation of the memory chips themselves, it is better to have the chips soldered.
 
That was another video that made no sense at all. You shouldn't test that on cards with 24GB; it's like upgrading from 16GB of RAM to 32GB and suddenly all your applications report they're using more. And I'm not sure Afterburner or any similar software has ever reported real VRAM usage.
I don't get what is so difficult about testing things in realistic scenarios: not ultra settings, not 4090s at 1080p. It's infuriating.

I'm sorry, but they are sellers, all of them. They want to sell you their videos and they make a living from our clicks; testing realistic scenarios is not click material.

Next we'll be testing Cyberpunk at 4K on a 3dfx, using no keyboard and with one hand tied behind my back, while drinking cola and eating pizza driving down the highway, because lols.

I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require at low and ultra settings.
 
This just shows the bad value proposition of the RTX 4070. True, power consumption has gone down significantly from the RTX 3070 to the RTX 4070. However, if the RTX 3070 shows such performance with extra RAM, it confirms what reviewers have been saying: the RTX 4070 is actually a 4050 or 4060 at a higher price than the 3070 launched at. I just sold my 3070 thinking of the 4070, but will actually wait for AMD's next launch.
 
Surely it totally depends on resolution and settings. You don't need more than 8GB if you're playing at 1080p.

You can probably just tweak settings at higher resolutions to stay within it. Easier than modding hardware.
Sure, you can run on low as well. Why even bother buying a new GPU at all, right?!

I don't get this sentiment at all. You pay through the nose for even a midrange GPU, and then people are content with all sorts of quality reductions to keep it afloat. Meanwhile, there are ALSO similarly priced midrangers that don't force you into that, today or in the next three to five years.
Like... why even bother to begin with? Just use your IGP and save money; 720p is, after all, just a setting, eh.
 
Looks like 10GB is more than enough, since those games were pulling a maximum of ~9600MB of VRAM usage...
 
Looks like 10GB is more than enough, since those games were pulling a maximum of ~9600MB of VRAM usage...
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech, it's just a big game.
 
I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require at low and ultra settings.

A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; both cards are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
 
A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; both cards are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
Improving minimum frames and the delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)
 
Improving minimum frames and the delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)

What are we talking about now? Brian's video? I see different minimums, maximums and 0.1% lows with the same VRAM (24GB), just using a different card, AMD or Nvidia.

The cards even allocate and use VRAM differently just by changing brands while having the same VRAM size.

How are you concluding anything about maximums and minimums based on VRAM size? Talking about the 8GB comparisons, the 4K numbers are a bit irrelevant for this discussion.
 
I don't get this sentiment at all. You pay through the nose for even a midrange GPU, and then people are content with all sorts of quality reductions to keep it afloat.
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc. effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-Yottabyte VRAM GPU...
 
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc. effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-Yottabyte VRAM GPU...

Agreed, but head bob is great though, realism!!! o_O:laugh:
 

Fresh and still hot

No AMD vs Nvidia in this video. Only 8GB vs 16GB, pure Nvidia.
 
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech, it's just a big game.
It's just caching assets in VRAM. Nothing new. Games have done that for more than 10 years now...
 
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how Nvidia is da big evil guy.
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech, it's just a big game.
Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.
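
For reference, monitoring tools generally read a device-level counter of this kind underneath, which is why their "usage" figure really means allocation. A minimal sketch, assuming the nvidia-ml-py / pynvml bindings are installed; the number NVML reports is how much VRAM is currently allocated on the card, not how much of it the game actively touches each frame.

```python
# Minimal sketch: query the device-level VRAM counter via NVML.
# Assumes the nvidia-ml-py / pynvml bindings are installed (pip install nvidia-ml-py).
# NVML's "used" figure is memory allocated/reserved on the device, not active use.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values in bytes
    gib = 1024 ** 3
    print(f"Total VRAM  : {mem.total / gib:5.1f} GiB")
    print(f"'Used' VRAM : {mem.used / gib:5.1f} GiB  (allocation, not utilization)")
    print(f"Free VRAM   : {mem.free / gib:5.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```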
 
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.

Allocation may not be utilization, but we've reached a point where games in general are beginning to no longer run adequately on 8 GB GPUs or 16 GB RAM PCs. People who make that argument often overlook or reject the concept of memory pressure. As physical memory nears exhaustion, data will first be compressed (which costs CPU cycles, but should still be manageable in most cases) and then, in order of priority, swapped onto slower memory tiers (whenever available) until it reaches storage, which is comparatively glacial even on high-speed NVMe SSDs.

Apple offers an easy way to read memory pressure in macOS's Activity Monitor, but Microsoft has yet to do something like this on Windows. A dead giveaway that you are short on RAM is when you begin to see the Compressed Memory figure rise (ideally you want this at 0 MB), and you'll be practically out of RAM when your commit charge exceeds your physical RAM capacity. This is also the reason why you will never see RAM usage maxed out in the Windows Task Manager: it will attempt to conserve about 1 GB of RAM for emergency use at all times. This leads many people with the mindset of "I paid for 16 GB of RAM and use 16 GB of RAM I shall" to think that their computers are doing OK and that they aren't short at all.
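
If you want to watch that threshold without third-party tools, here is a rough Windows-only sketch using the Win32 GlobalMemoryStatusEx call from Python. It approximates commit charge as the commit limit minus the commit still available; the compressed-memory figure itself isn't exposed through this API (Task Manager shows it as "In use (Compressed)").

```python
# Rough Windows-only sketch: compare system commit charge against physical RAM,
# approximating commit charge as (commit limit - commit still available).
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_uint64),      # installed physical RAM
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),  # commit limit (RAM + pagefile)
        ("ullAvailPageFile", ctypes.c_uint64),  # commit still available
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

stat = MEMORYSTATUSEX()
stat.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

gib = 1024 ** 3
commit_charge = (stat.ullTotalPageFile - stat.ullAvailPageFile) / gib
physical_ram = stat.ullTotalPhys / gib

print(f"Physical RAM : {physical_ram:6.1f} GiB")
print(f"Commit charge: {commit_charge:6.1f} GiB")
if commit_charge > physical_ram:
    print("Commit charge exceeds physical RAM -> you're leaning on the pagefile.")
```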

A similar concept applies to GPU memory allocation on Windows. As you may know by now, Windows doesn't treat memory as absolute values, but rather as an abstract concept of addressable pages, with the MB values reported by the OS being more estimates than a precise, accurate metric. Normally, the WDDM graphics driver will allocate the physical memory present on the graphics adapter plus up to 50% of system RAM, so, for example, if you have a 24 GB GPU plus 32 GB of RAM, you will have a maximum of 24 + 16 = around 40 GB of addressable GPU memory.

This means that an 8 GB GPU such as the RTX 3070 on a PC with 32 GB of RAM actually has 8 + 16 = around 24 GB of addressable memory. However, at that point the graphics subsystem is no longer interested in performance, but rather in preventing crashes, as it's fighting for resources demanded by programs in main memory. Once you run games that reasonably demand a GPU with that much dedicated memory to begin with, you can see where this is going fast.
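
To make that arithmetic explicit, a toy sketch of the rule of thumb described above (the function name and the fixed 0.5 factor are just for illustration; the actual shared budget is "up to" half of system RAM):

```python
# Toy illustration of the WDDM rule of thumb quoted above:
# addressable GPU memory ~= dedicated VRAM + up to 50% of system RAM.
def addressable_gpu_memory(vram_gb: float, system_ram_gb: float) -> float:
    """Approximate total addressable GPU memory in GB (illustrative only)."""
    return vram_gb + 0.5 * system_ram_gb

print(addressable_gpu_memory(24, 32))  # 24 GB card + 32 GB RAM -> ~40 GB
print(addressable_gpu_memory(8, 32))   # RTX 3070 (8 GB) + 32 GB RAM -> ~24 GB
```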

I believe we may be seeing this symptom in Hardware Unboxed's video, in The Last of Us, where the computer is attempting to conserve memory at all costs.

This is, of course, my personal understanding of things. Don't take it as gospel; I might be talking rubbish - but one thing's for sure: by ensuring that I always have more RAM than an application demands, I have dodged that performance problem for many years now.
 
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.
Certainly, but it is an indicator for sure.
 
Me watching these videos:

Sure, 8GB was stupid; my RX 480 came with 8GB back in 2016
using a 3070, especially considering the cost, to game at 1080p is absurd
using a 3070 to run a game at 1440p at ultra is also absurd, it brings nothing to the experience; HU think the same as me, they even made a video about it
don't buy one, not because of the VRAM, but because of the price, even if you play CSGO competitively
if you already have one, let HU make another video just so you feel even more buyer's remorse; where were they when the 3070s released?!

and we keep beating these points around.
 
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc. effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-Yottabyte VRAM GPU...
Absolutely I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low but the IQ win is high.
 
Absolutely I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low but the IQ win is high.
High textures in RE4R are demanding on VRAM, though. My 3080 crashed a few times until I reduced some settings at 1440p.
 
A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; both cards are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future - like it or not.

You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.
Let's wait for the stuttering then :D
 
Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future - like it or not.

Not crash! But fail to start, or stutter; it's true, I'm not contesting that.
 