
Resident Evil 4 Benchmark Test & Performance Analysis



Wuut, something's wrong here. A $999 AMD card can't beat a $1,199 Nvidia card in R4YTR4C1NG!

Right, RIGHT?!? :wtf:
 
The fact that most cards out there don't have anything more than 8 GB of VRAM makes this game a fail when it comes to optimization. So they want people to run FSR at 1080p even on a card like the RTX 3070 Ti. While I agree that Nvidia created this problem by limiting the amount of VRAM, throwing more VRAM at it is not going to solve the problem: developers can optimize the game to fit within the VRAM limit, but they chose not to. So even with 12 or 16 GB, the root cause of poor optimization does not go away.
I don't know why more people don't understand this. It doesn't matter how good your hardware is if devs don't bother optimizing for it, and loading cards with tons of VRAM so devs don't have to optimize is a great way to jack up the price of your cards.
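
(For the curious: below is a minimal sketch of the kind of VRAM-budget check a PC port can make at startup, using DXGI's QueryVideoMemoryInfo. The first-adapter choice and the 20% headroom policy are made-up illustrations, not anything Capcom actually does.)

```cpp
// Minimal sketch (not Capcom's actual code): size a texture pool from the
// VRAM budget DXGI reports instead of assuming all 8 GB are free.
// Assumes Windows 10+, dxgi1_4.h, and linking against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;  // first adapter, for brevity
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // Budget = what the OS currently lets this process use. It is less than
    // the physical VRAM because Windows and other apps hold some of it.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Hypothetical policy: keep ~20% headroom, spend the rest on textures.
    UINT64 texturePool = info.Budget - info.Budget / 5;
    printf("VRAM budget: %llu MB, texture pool: %llu MB\n",
           info.Budget >> 20, texturePool >> 20);
    return 0;
}
```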
 
These next-gen remakes are tuned to fit within the 16 GB of memory available on consoles, roughly 12 GB of which is often dedicated to the GPU. On top of that, the consoles tend to use lower-quality textures, which does save some space. But if you are spending $800 on a 12 GB 4070 Ti, or if you spent $600 on a 3070 Ti, you would expect better-than-console textures, because if you were happy with console quality, why not just spend $500 on a PS5 or Xbox Series X?
 

Wuut, something's wrong here. A $999 AMD card can't beat a $1,199 Nvidia card in R4YTR4C1NG!

Right, RIGHT?!? :wtf:

Yeah, in a game where you can't even tell RT is turned on, Radeon does OK... Who would have thought.
 
Just lower some settings and you'll be fine
The first sensible answer.

As if anyone would stubbornly cling to the maximum texture setting rather than lower it to play, at which point the 3070 gets higher FPS (and you can extrapolate that it would in the failed tests too, with a sensible texture setting). In fact, I'm pretty sure there's no difference in texture visuals until you set the budget below 3 GB.
 

DF has a pretty useful guide for people with 8/10 GB graphics cards.


My argument is that people spending $500+ on a GPU shouldn't have to worry about VRAM one generation later. But this is also nothing new from Nvidia, so I guess gamers who bought any of their mid-tier cards deserve this.
 
I suppose it's up to the buyer what they'd expect. I'm not sure that, one generation after buying a mid/upper-mid graphics card, I'd be pissed I can't max out every setting in every AAA game that releases; maxing out isn't the smartest choice of settings anyway, and oftentimes the texture setting does bugger all beyond allocation. Virtually everyone compromises somewhere. And not only are these situations rare, they're often unrealistic, and compounded by certain games that release half-baked. I can see it increasing as time marches on, because of course that's the only way this will trend, but it saddens me to see so many schadenfreude-laden reactions. To some, the most important part is relishing every short-lived burst of negative attention Nvidia gets.

What would I do with a 3070? Lower textures as per the optimized settings, drop in the DLSS mod, and enjoy.
 
If Nvidia didn't have a history of giving otherwise fine-performing cards inadequate memory, I don't think I'd be so hard on them, but this has been a thing at least since the GTX 680/670. The fact that a 6800 XT will be a lot better than a 3080 10 GB in a lot of games going forward, just because Nvidia chose to cheap out on memory, is a bad look. Same with the 6700 XT vs the 3070. Nvidia isn't stupid; they do this to force people to upgrade sooner than they otherwise would, and given what they want to charge, especially with any generation going forward, we should expect better.
 
Chances are, this game will get a patch that fixes the crashing on lower-VRAM cards. I used to play quite a few games on a 1070, or even a 3070, that would eat up more VRAM than the GPU has, yet wouldn't crash or even drop much in performance; RE2 Remake, for example. I'm assuming it comes down to how the engine manages VRAM. Anyway, let's see what Capcom does.

FSR 2 also looks like crap here, apparently.
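
(On why some games degrade instead of crashing when VRAM is over-committed: a well-behaved engine watches the OS memory budget and sheds texture detail under pressure. A minimal sketch, reusing an IDXGIAdapter3 as in the earlier snippet; ShrinkTexturePoolTo is a hypothetical engine hook, not a real API.)

```cpp
// Minimal sketch of graceful degradation under VRAM pressure. When the OS
// shrinks the budget, evict texture detail instead of letting allocations
// fail and the game crash with a D3D error.
#include <dxgi1_4.h>
#include <windows.h>

void WatchBudget(IDXGIAdapter3* adapter3) {
    HANDLE event = CreateEventW(nullptr, FALSE, FALSE, nullptr);
    DWORD cookie = 0;
    adapter3->RegisterVideoMemoryBudgetChangeNotificationEvent(event, &cookie);

    // A real engine would run this on a streaming thread with a shutdown signal.
    while (WaitForSingleObject(event, INFINITE) == WAIT_OBJECT_0) {
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
        if (info.CurrentUsage > info.Budget) {
            // Hypothetical hook: drop the highest mips until usage fits.
            // ShrinkTexturePoolTo(info.Budget);
        }
    }
    adapter3->UnregisterVideoMemoryBudgetChangeNotification(cookie);
    CloseHandle(event);
}
```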
 
If Nvidia didn't have a history of giving otherwise fine-performing cards inadequate memory, I don't think I'd be so hard on them, but this has been a thing at least since the GTX 680/670. The fact that a 6800 XT will be a lot better than a 3080 10 GB in a lot of games going forward, just because Nvidia chose to cheap out on memory, is a bad look. Same with the 6700 XT vs the 3070. Nvidia isn't stupid; they do this to force people to upgrade sooner than they otherwise would, and given what they want to charge, especially with any generation going forward, we should expect better.
See, I remember when we were hammering the 3080 for its 10 GB VRAM buffer and being told the 16 GB 6800 series was totally unnecessary.

LOL. LMFAO even.
 

I haven't been a huge fan of RDNA2 or RDNA3, but I do appreciate that AMD gives their cards adequate VRAM, even accounting for edge cases.

At launch, MSRP vs MSRP, I likely would have bought the 3080 10 GB as well, but giving the 3070/3070 Ti 8 GB was at best short-sighted, at worst just another way to make gamers upgrade sooner than they'd otherwise have to.

The fact that they chose to cut down an $800+ card like the 4070 Ti the way they did, to the point it loses a lot of gas at 4K, makes me think it's the latter.
 
Not sure; I just tried the demo... it runs at 2K 60 FPS on my 1080 Ti. VRAM usage is as W1z wrote in the review.

 
I suppose it's up to the buyer what they'd expect. I'm not sure that, one generation after buying a mid/upper-mid graphics card, I'd be pissed I can't max out every setting in every AAA game that releases; maxing out isn't the smartest choice of settings anyway, and oftentimes the texture setting does bugger all beyond allocation. Virtually everyone compromises somewhere. And not only are these situations rare, they're often unrealistic, and compounded by certain games that release half-baked. I can see it increasing as time marches on, because of course that's the only way this will trend, but it saddens me to see so many schadenfreude-laden reactions. To some, the most important part is relishing every short-lived burst of negative attention Nvidia gets.

What would I do with a 3070? Lower textures as per the optimized settings, drop in the DLSS mod, and enjoy.

It is one thing to lower compute-heavy effects to keep frame rates where you want them. Dropping texture quality on a $500, two-year-old GPU is already a bit of a joke, especially when the cheaper 3060 12 GB can offer comparable or better performance.
 
I put both of those in the same bucket; they're all settings that enhance or affect visual quality, and when most games show little to no difference between the upper texture settings, yeah, I treat them the same.
Nvidia isn't stupid; they do this to force people to upgrade sooner than they otherwise would
lol, force? Steady on there, mate; nobody is forcing anyone to do anything.
 
It's OK to turn down textures on a $600 MSRP card less than two years after its launch?

The 3070 Ti launched in June 2021 for $600. This is not old hat.

Here's the context: the $480 6700 XT can run the game without compromise. The $580 6800 can too. These cards launched cheaper, and earlier. So I reject that a 21-month-old card (as of RE4's launch) having to lower settings is normal by any standard, when its cheaper, older competitors don't have to. I'd also say it's not just an example of the competition being better: Nvidia's 1070 Ti didn't have VRAM issues less than two years after launch, for example.

The game should be optimized better, undoubtedly. But it also wouldn't be a problem if Nvidia didn't gimp the VRAM, which oh yes they most certainly did. I have a 3090, for what it's worth; I buy what suits my needs and recommend what suits others'. I have very rarely recommended the 3070 series, for the VRAM reason (the exception was Cyberpunk, where frankly it stomps on AMD), and it should have been obvious to any consumer that by 2023, or 2024 at the latest, 8 GB would show its clear and obvious weakness.
 
"Another result worth pointing out is that cards with 8 GB VRAM will be unable to run ray tracing at all"

bUt mY rTx 3070 hAs pLeNtY oF mEmOrY!!!!

"no game needs more than 8/10GB!!!"

3070/3080s and now 4070s don't have enough memory and whoever bought/buys them will regret very soon, you can thank nVidia for being greedy and yourself for buying such a product
Every advocate who defended the RTX 3070 / RTX 3070 Ti's 8 GB of VRAM as enough was wrong.

By design, NVIDIA made sure that any PC gamer who purchases its mid-to-upper SKUs will need to purchase another GPU within a game console generation. It's a trickle-upgrade regime.

My secondary living room gaming PC has the unfortunate MSI Suprim X RTX 3070 Ti 8 GB, and I plan to upgrade it to an RX 7900 XT 20 GB. The RX 7900 XT would be my first AMD discrete GPU since the Radeon R9 290X.

My primary gaming PC has a Gigabyte RTX 4080 16 GB Gaming OC.

Tried the demo on my 3060 Ti; the game kept crashing with a D3D error in the menu whenever I got close to 7 GB of VRAM used. I finally managed to get in-game with settings showing ~6 GB usage. It crashed again during the bingo cutscene. I never even attempted to enable RT, so that's not the issue here.

Capcom messed up something; the game needs a patch, not a better GPU.
Your GPU is not at PS5's VRAM standards. Upgrade your GPU.
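
(Side note on that D3D error: a minimal sketch, assuming an existing D3D12 device and swap chain, of how a title can at least report why the device died when Present fails. VRAM exhaustion usually surfaces as a device-removed error rather than a clean out-of-memory message.)

```cpp
// Minimal sketch: diagnose a device-removed crash after a failed Present.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <cstdio>

void PresentAndDiagnose(IDXGISwapChain3* swapChain, ID3D12Device* device) {
    HRESULT hr = swapChain->Present(1, 0);
    if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET) {
        HRESULT reason = device->GetDeviceRemovedReason();
        switch (reason) {
        case DXGI_ERROR_DEVICE_HUNG:
            printf("GPU hang (work took too long)\n");
            break;
        case DXGI_ERROR_DRIVER_INTERNAL_ERROR:
            printf("Driver internal error, often memory exhaustion\n");
            break;
        default:
            printf("Device removed: 0x%08lX\n", (unsigned long)reason);
            break;
        }
        // A robust port would recreate the device or drop settings here
        // instead of hard-crashing to desktop.
    }
}
```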
 
These next-gen remakes are tuned to fit within the 16 GB of memory available on consoles, roughly 12 GB of which is often dedicated to the GPU. On top of that, the consoles tend to use lower-quality textures, which does save some space. But if you are spending $800 on a 12 GB 4070 Ti, or if you spent $600 on a 3070 Ti, you would expect better-than-console textures, because if you were happy with console quality, why not just spend $500 on a PS5 or Xbox Series X?

Most people's PCs perform worse than consoles, and that's always been the case. The reality is that most gaming PCs run 1080p 60 Hz with 16 GB of RAM, a six-core CPU, and an Nvidia x060-series GPU, and will get utterly curb-stomped by a console. Most of them aren't really playing AAA games either. PC gaming is largely stuff like DOTA and other things that run just fine on a potato PC, and the draw of the PC is not a superior experience (again, it will get crushed by a console) but a mix of Steam sales, piracy, and the ability to cheat like crazy. Far from being the master race, in most cases PC is a cheaper yet inferior gaming experience, but you can cheat in-game and steal games at will. That is the reality of PC gaming; it always has been.

Of course there are edge cases, like people here, where 4090s and 120 Hz 4K or 240 Hz 1080p are common, but that's a remote fraction of the PC gaming community. Most 4K high-resolution gaming is happening on consoles, and most low-detail 1080p gaming is happening on PC.

Capcom may have dropped the ball on optimization, true. But the majority of PC gamers run PCs that get slaughtered by consoles, and the majority of people who care about high resolution and image quality are going to play this on a console, always. It's typically a waste of time to worry about high-end PCs even remotely; the number of them is minuscule, and they are such edge cases they deserve very little if any attention. I say this as someone with a stupidly high-end PC.
 
The GA104-based RTX 3070 has the compute (20.31 TFLOPS FP32) and ROP (96 ROPs) throughput to beat either the XSX or the PS5, but it's gimped by 8 GB of VRAM.

The GA104-based professional RTX A4000 has 16 GB of VRAM.
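
(For anyone checking that figure: FP32 throughput is 2 FLOPs per FMA per shader per clock, so the 3070 works out to 2 × 5888 shaders × 1.725 GHz reference boost ≈ 20.31 TFLOPS. By the same arithmetic, the PS5's GPU is about 10.28 TFLOPS and the XSX's about 12.15 TFLOPS.)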

For the full-L3-cache parts, note that a six-core Zen 3 is roughly equivalent to an eight-core Zen 2. The PS5's Zen 2 is the half-L3-cache variant from the Zen 2 APU line.

The PS5's recycled AMD 4700S SKU shows the CPU being slower than the Ryzen 7 4750G (half-L3 Zen 2 with full quad-pipeline AVX2 hardware), the Ryzen 7 3700X, and the Ryzen 5 5600X.

[Attached benchmark charts: AMD 4700S vs. Core i7-9700, plus 4700S results in Octane, x265, and Cinebench R20]

From https://dicaspc.com.br/reviews/4700s/


By its premature-obsolescence roadmap, NVIDIA made sure that any RTX 3070 or 3070 Ti (8 GB VRAM) PC gamer will need to upgrade their GPU within a DX12U-era game console generation.

In terms of VRAM and product availability, AMD's RX 6700 XT 12 GB and RX 6800 16 GB are the closer PS5 and XSX equivalents. I should have purchased the AMD competitor to the RTX 3070 Ti for my secondary living room gaming PC.
 
So buy a 4090 or get thine ass on a console, which has always been the case with the PC. Go big or go to console. Did you buy a 4090? If not, then go to a console. We aren't even touching how consoles allow more direct-to-the-metal optimization, or more optimizations in general. Again, do you have the top of the top? If not, you are not the master race; you are paying for crap and a crap experience so the upper 5-10% of PC gamers can dunk on a system that is better than yours.

Nvidia did do this; you did it! And you cheered it! And now here you are with exactly what you kept asking for and voting for with your money, and you are not happy with it. You own all of it. And there you are, ass out like a fool, with an inferior system that's never optimized but does have RGB! Also tons of cheating and piracy. That is who you are and what your community is.
 
FYI, a PC gamer with RX 6800 / RX 6800 XT 16 GB has fine wine.

It's OK to turn down textures on a $600 MSRP card less than two years after its launch?

The 3070 Ti launched in June 2021 for $600. This is not old hat.

Here's the context: the $480 6700 XT can run the game without compromise. The $580 6800 can too. These cards launched cheaper, and earlier. So I reject that a 21-month-old card (as of RE4's launch) having to lower settings is normal by any standard, when its cheaper, older competitors don't have to. I'd also say it's not just an example of the competition being better: Nvidia's 1070 Ti didn't have VRAM issues less than two years after launch, for example.

The game should be optimized better, undoubtedly. But it also wouldn't be a problem if Nvidia didn't gimp the VRAM, which oh yes they most certainly did. I have a 3090, for what it's worth; I buy what suits my needs and recommend what suits others'. I have very rarely recommended the 3070 series, for the VRAM reason (the exception was Cyberpunk, where frankly it stomps on AMD), and it should have been obvious to any consumer that by 2023, or 2024 at the latest, 8 GB would show its clear and obvious weakness.
"8 GB VRAM is enough" advocacy/shills for RTX 3070 and RTX 3070 Ti were strong.
 
The issue you have is that the number of PC gamers who have that is minimal, next to nothing; they don't count. The Steam survey puts the 3060, 2060, and 1060 as the most common cards out there. There are more GTX 1650s and 1050s out there than RX 6800s; it might as well not exist.

And that's the catch. Most PC gamers are running a PC that is outright pathetic compared to a console. It is a vastly inferior experience. PC is not the master; it's the 1080p, low-details, low-frames solution, unless you are playing crap like League of Legends. This is widely known and laughed about outside a tiny minority of PC Master Race fools who keep getting upset that they don't get priority, which just adds to the laugh. Again, I say this as someone with a bonkers PC (4090, i9, 64 GB).

So what you have to realize is that when developers think PC, they don't think better; they think the shit platform. They are not optimizing for the high end; they are optimizing for low 1080p 60, because that is the reality of the PC as a platform. And no amount of anger at Nvidia, or complaining about optimization, is going to change the fact that when it comes to gaming, the PC is at the bottom of the shit pile.
 
Did you forget when Crysis was released in 2007? What was the common GPU in 2007?

The install base for the PS4 is more than twice as large as the PS5's, but that didn't stop improvements in texture diversity and increased texture resolution once the PS5 became the baseline hardware spec.

Your "optimization" argument is a code word for degradation.
----
For the record, I have two AM5-based gaming PCs, i.e., an RTX 4090 / 7950X / ASUS X670E Hero / 32 GB DDR5-6000, and the other gaming PC is in my system specs.

I use this PC for work. I sold my two RTX 3080 Ti GPUs to help fund RTX 4080 and RTX 4090 purchases.

My RTX 3070 Ti is in my living room gaming PC, and I plan to swap it for an RX 7900 XT 20 GB.
 
"Another result worth pointing out is that cards with 8 GB VRAM will be unable to run ray tracing at all"

bUt mY rTx 3070 hAs pLeNtY oF mEmOrY!!!!

"no game needs more than 8/10GB!!!"

3070/3080s and now 4070s don't have enough memory and whoever bought/buys them will regret very soon, you can thank nVidia for being greedy and yourself for buying such a product
I believe the RTX 4070 will have 12 GB, which seems to be the minimum for 4K combined with RT nowadays.
 
Under Windows 10, the operating system and non-gaming apps consume some dedicated GPU memory of their own, so a game never gets the full amount.

From https://devblogs.microsoft.com/directx/demystifying-full-screen-optimizations/

Legacy Fullscreen Exclusive (FSE) is disabled by default. Windows 10's "Fullscreen Optimizations" mode is like a borderless windowed mode that still keeps desktop composition active.

Legacy Fullscreen Exclusive (FSE) mode, by contrast, gives your game complete ownership of the display and of the graphics card's resources.
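
(A minimal sketch of that legacy FSE toggle, using IDXGISwapChain::SetFullscreenState; error handling and mode-change plumbing omitted. With Fullscreen Optimizations, games skip this call and present a borderless flip-model window instead.)

```cpp
// Minimal sketch of the legacy exclusive-fullscreen toggle, assuming an
// existing swap chain created for a window on the target display.
#include <dxgi.h>

HRESULT EnterExclusiveFullscreen(IDXGISwapChain* swapChain) {
    // nullptr = let DXGI pick the output that contains the window.
    return swapChain->SetFullscreenState(TRUE, nullptr);
}

HRESULT LeaveExclusiveFullscreen(IDXGISwapChain* swapChain) {
    return swapChain->SetFullscreenState(FALSE, nullptr);
}
```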
 