
Silent Hill 2 Remake Performance Benchmark

I think the developers want to capture the authentic feeling of Japanese-made games that run like crap on high-end PCs, aka Elden Ring, Monster Hunter and many others :cool:
In the early days of Special K, the dev commented on how the developers of Tales of Zestiria understood so little about PC gaming that they resorted, out of desperation, to extremely short draw distances, FXAA and a 30 fps cap. Then he released his mod that enabled flip-model presentation and changed the CPU scheduling, and suddenly the game ran better while using SGSSAA, mipmaps, higher-resolution shadows, 60 fps, and custom texture replacements.
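For anyone wondering what "flip mode" actually is: it's the DXGI flip presentation model, which skips the extra copy of the legacy BitBlt model. Here's a minimal sketch of what opting into it looks like (D3D11, windowed; the helper name is mine and error checks are trimmed) — not how Special K itself does it:

[CODE]
// Minimal sketch of creating a DXGI flip-model swap chain (D3D11, windowed).
// CreateFlipModelSwapChain is a made-up helper name; error handling trimmed.
#include <d3d11.h>
#include <dxgi1_2.h>

HRESULT CreateFlipModelSwapChain(ID3D11Device* device, HWND hwnd,
                                 IDXGISwapChain1** outSwapChain)
{
    // Walk up from the device to the DXGI factory that created it.
    IDXGIDevice*   dxgiDevice = nullptr;
    IDXGIAdapter*  adapter    = nullptr;
    IDXGIFactory2* factory    = nullptr;
    device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    dxgiDevice->GetAdapter(&adapter);
    adapter->GetParent(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};           // Width/Height 0 = size from window
    desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc  = { 1, 0 };               // flip model forbids MSAA back buffers
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 3;                      // flip model requires at least 2
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD; // the key line: flip, not BitBlt

    HRESULT hr = factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                                 nullptr, nullptr, outSwapChain);
    factory->Release();
    adapter->Release();
    dxgiDevice->Release();
    return hr;
}
[/CODE]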
 
I don't understand why people benchmark brand-new games when these days they're clearly full of bugs and an unoptimised mess! Give it a few months, when these are mostly sorted, and the performance numbers will go up for sure, as they usually do for both Nvidia and AMD.
 
I don't understand why people benchmark brand-new games when these days they're clearly full of bugs and an unoptimised mess!
Because people are interested in it now, not in the future. Source: Steam player-count charts
 
Yep, I just played for 15 minutes; the game is eerily fun and feels very similar to Alan Wake 2.

I have never played the Silent Hill series, so this would be a fun game

I'm getting around 80 FPS average at 4K with RT + DLSS Balanced, which is more than playable, though there are sometimes noticeable dips below 60 FPS
 
110 FPS at 1080p on a $1,600 GPU (RTX 4090)? The game definitely does not look better than some titles from 2018 or 2019.

Game prices are rising. What's the average now, $70-80 for the standard edition of a newly released title? GPU prices are rising too; a mid-range GPU goes for $600-700. Yet this is what we get: performance keeps getting worse. If you look at the screenshots, this game definitely does not look demanding enough to justify it. It does not look better than Wukong, or even the four-year-old Cyberpunk.

What concerns me most is this trend of putting DLSS/FSR and other fake-frame, distortion-generating stuff everywhere. Because of crap optimization, devs encourage you to turn on upscaling to get better FPS (and so does Ngreedia). In other words: devs are too lazy to optimize the game so you can enjoy it at full blast at native resolution, so please, would you be so kind as to enable the fake-frame generation, or else stfu? It's not okay that GPUs with such high compute-unit counts, extreme power draw, and prices beyond $700 perform so horrendously in the newest games, even below 4K. At this price point they should deliver at least 120 FPS at 1080p. This trend needs to end, and devs should do their damn job; yes, optimizing is also part of their job.

The newest GPUs have so much performance that they shouldn't need DLSS/FSR in the first place. Instead, users should be encouraged to use 1.5x or 2.0x resolution scaling. Old games like Ghost Recon Wildlands, Far Cry 5, or even Borderlands 3 look great and detailed even at 1080p with 2.0x resolution scaling (rendered at 4K and shrunk down to 1080p). Instead, the trend is the opposite: render at below 1.0x resolution scale, then distort the image by upscaling and guessing frames.
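To make the numbers concrete: the scale factor applies per axis, so 2.0x at 1080p means 4x the pixel work of native. A rough sketch (the helper is hypothetical, just illustrating the arithmetic):

[CODE]
// Hypothetical helper illustrating per-axis render scaling: 2.0x at 1080p
// renders at 3840x2160 (4x the pixels), then downsamples to the display.
#include <cstdio>

struct Resolution { unsigned w, h; };

Resolution renderResolution(Resolution display, float scale)
{
    return { static_cast<unsigned>(display.w * scale),
             static_cast<unsigned>(display.h * scale) };
}

int main()
{
    Resolution ss = renderResolution({1920, 1080}, 2.0f);  // supersampling
    Resolution up = renderResolution({1920, 1080}, 0.67f); // typical upscaler input
    std::printf("2.0x: %ux%u, 0.67x: %ux%u\n", ss.w, ss.h, up.w, up.h);
    // prints: 2.0x: 3840x2160, 0.67x: 1286x723
}
[/CODE]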

So what comes next? An RTX 5090 with 20k shaders and a 600 W power draw that won't be able to run the newest titles beyond 60 FPS at 4K native?
 
AMD failed at this; how the 7900 XTX stays behind even the non-Super 4070 Ti is beyond me. No wonder AMD had to lower the 7900 XTX's price to the 4070 Ti's level.
Meanwhile, I have been saying AMD needs the help of AI to optimize drivers like Nvidia did years ago, and I get trolls triggered on the Videocardz website. :kookoo:
 
Unreal Engine has favored Nvidia over AMD for a long time. Even Unreal Engine 3 was announced and showcased in partnership with Nvidia.
 
Meanwhile, I have been saying AMD needs the help of AI to optimize drivers like Nvidia
There is no AI optimizing drivers; it's a bunch of BS. That's not how this works.
 
Next-gen cards will/should be better optimized for UE5. I've seen much older games that look better than some UE5 games, tbh. Games are getting more demanding to run to push hardware sales; it's been like that forever, what a total shocker.
 
There is no AI optimizing drivers; it's a bunch of BS. That's not how this works.
I guess this is not a thing?
 
TBF, if you set everything to Epic it runs like ass on most GPUs. I've got an RX 6800, and with a mixture of settings to keep most of the eye candy looking good I get 70 FPS outside and 80-90 inside; playing at 1080p helps too. But I rarely buy a AAA game and literally turn everything to Epic/Ultra, and the majority of people don't either; they'll tweak settings to a level of performance and graphics that's acceptable for them
 
This is the worst-optimized game ever.
Doesn't beat GTA IV.

I don't understand why people benchmark brand-new games when these days they're clearly full of bugs and an unoptimised mess!
When you only benchmark games in "optimised" conditions, it might seem like the game devs are actually doing their jobs. No. That's utterly deceiving, and it's blatant enabling.

It's bad enough that people pre-order games regardless of circumstances because they're dopamine junkies; we don't need to create even more conditions for devs' laziness. They wouldn't be like this if millions of people were trash-talking them instead of buying.

Since W1zzard benchmarked it and called it out for being unplayable on most hardware, some people will postpone the purchase until they upgrade or the game gets polished. If he were even more blunt and direct with his conclusions, also murdering the optimisation with words on some popular YT channel, it would be another nice little wake-up call.

I don't enjoy being milked and I vote "no" with my wallet.
run the newest titles beyond 60 FPS at 4K native?
That has never been the case, FYI. Before the 3090, native 4K gaming was only possible by lowering settings, a lot of them. Games from 2017 ran like crap at 4K on a 1080 Ti. I don't remember the exact title, but there was even one that stuttered at 1440p.
As of now, 4K is purely upscaling territory if we're talking about the latest games. There's almost zero reason to run them native, because upscalers have matured and cost very little quality (sometimes none) for the performance boost they provide. Unlike 1440p, and especially 1080p, where upscaling artifacts are still very noticeable.
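For reference, these are the per-axis render-scale factors commonly cited for the DLSS 2 presets (treat them as approximate, and nothing here is specific to this game). The arithmetic explains why 4K upscaling holds up so much better:

[CODE]
// Commonly cited per-axis DLSS 2 preset scale factors (approximate values,
// not specific to this game), applied to a 4K output.
#include <cmath>
#include <cstdio>

int main()
{
    const struct { const char* mode; float scale; } presets[] = {
        { "Quality",           1.0f / 1.5f }, // ~0.667
        { "Balanced",          0.58f },
        { "Performance",       0.50f },
        { "Ultra Performance", 1.0f / 3.0f }, // ~0.333
    };
    const unsigned w = 3840, h = 2160;
    for (const auto& p : presets)
        std::printf("%-17s -> %ldx%ld\n", p.mode,
                    std::lround(w * p.scale), std::lround(h * p.scale));
    // Even Performance mode at 4K feeds the upscaler a full 1920x1080 frame,
    // while Quality at a 1080p output renders from only ~1280x720, which is
    // why artifacts are far more visible at lower output resolutions.
}
[/CODE]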
 
Looking at this, and taking the upscaling article into consideration as well, it looks like a half-baked beta that was coded on Nvidia hardware, yet even there it fails to offer the DLSS quality that Nvidia users expect.
 
Next-gen cards will/should be better optimized for UE5. I've seen much older games that look better than some UE5 games, tbh.

Next-gen cards will only be optimized for UE5 in the sense that they will have more RT performance. Otherwise, I don't see the value in hardware changes that don't also benefit other games/engines.

Games are getting more demanding to run to push hardware sales; it's been like that forever, what a total shocker.

There are three factors that make the current situation unique:

1) The price of the entire GPU stack has increased. You used to get 76% of flagship performance in cards like the 970 for $330; now you get 50% of flagship performance for $550. You are both paying more (price increases vastly outpace inflation) and getting less, a lot less (the quick calculation after this list puts a number on it). Forget about the flagship card, which has gone from $700 to $1,600.

2) Game hardware demands have skyrocketed, not just increased at their typical pace. When a game requires a $550 GPU just to hit 60 FPS at 1080p before you even enable RT in the settings, that's a problem. Heck, you can spend $1,600+ on a 4090 and still be forced to lower settings to run 4K, or even 1440p at a higher refresh rate. The worst part is that a lot of these games look worse with RT disabled than games designed first and foremost with rasterized lighting in mind.

3) The vast majority of people cannot enjoy these new features. Too many games essentially require a 4080 or a 4090 to have all the eye candy turned on, and even then they sometimes still require DLSS to get a playable frame rate. Forget about anything below a 4090 being capable of RT after the 5000 series launches and games start targeting that higher bar.

The combination of large jumps in hardware demands and the ever-increasing prices of GPUs has understandably made gamers very vocal about this issue.
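To put a rough number on point 1, using the figures quoted above (a back-of-the-envelope sketch, not exact market data):

[CODE]
// Back-of-the-envelope check of point 1 using the figures quoted above:
// relative flagship performance per dollar, then (GTX 970 era) vs. now.
#include <cstdio>

int main()
{
    double perDollarThen = 0.76 / 330.0; // 76% of flagship perf for $330
    double perDollarNow  = 0.50 / 550.0; // 50% of flagship perf for $550
    std::printf("value regression: %.1fx worse\n",
                perDollarThen / perDollarNow); // ~2.5x less perf per dollar
}
[/CODE]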
 
This explains why UE5 performs so badly in games.

This could really do with some more exposure. Damn, that rabbit hole goes deep. His other stuff about upscaling, too. Wow.
 
Wow, nice find. Very well-explained video.

This puts the finger on so many things I've assumed were happening and have recognized in newer games. I've also mentioned a lot of it in topics here on TPU even before knowing this, as have many others. We're being taken for fools, and there's an industry game being played. I'm not a conspiracy-theory fan at all, but this, this is real, and the video also shows how Nvidia is ready to take advantage. The artifacts and graphical issues shown also relate to SO MANY of the topics people have posted about graphical anomalies in various games/cards over the last few years. Crazy.

It might deserve its own topic, but it also relates strongly to the performance and graphical results we see in SH2. Also, tell me again that 'games are too expensive to make'... if you see what the money and dev time are actually spent on.

___

Epic, no RT, XeSS Ultra Quality, 60 FPS locked, about 400 W system power. Pretty unimpressed overall, looking at it myself now. With RT, similar... but it's not bad. The performance, though, just lol.

Without XeSS, running native with no AA, it's a total mess and I get about 54 FPS; foliage and hair, for example, are just horrible to look at. Gonna try FSR3, but overall I've found myself using XeSS.

(screenshot attached)
 
The visual fidelity gain at max settings does not look worth the performance hit in this title
 
This game looks SHOCKINGLY bad for the performance it's putting up. I honestly thought it was a remaster of the original rather than a completely rebuilt game on UE5. It reminds me more of the OG Alan Wake, which I played through recently (with some nicer shader effects), than something developed on a cutting-edge engine.
Absolutely insane performance numbers for the visuals presented, just an all-around really bad job... but then, it is Bloober Team, after all.
Imagine showing this to someone from 10 years ago: graphics barely any better and performance in the gutter.
The worst part is that a lot of these games look worse with RT disabled than games designed first and foremost with rasterized lighting in mind.
(attached image: super mario remake.png)
 
The visual fidelity gain at max settings does not look worth the performance hit in this title
That's the same for 99% of ALL modern AAA games: high/medium settings and some tweaking will give you 95% of the visuals of the GPU-destroying Epic/Ultra settings. There's a corner of gamers who are snobs and think they should be able to run every game at 4K/RT/Ultra at 200+ FPS, further exacerbated by the introduction of ultra-high-Hz monitors and CS/FN kiddies who think running 300+ FPS makes them more 1337 than everyone else, having obviously never had to deal with trying to hit 30 FPS at 360p back in the day, or with games not running at all because they didn't meet the HW reqs
 
it's a mental illness IMO
It could be that, but games have IN FACT become worse from a visuals standpoint. Yes, textures are much higher definition, but the TAA, Nanite, Lumen, etc. shenanigans only turn games into putrid vomit and an optical migraine running at abysmal framerates. People like you are enabling this cancer.
Gamers should be much more vocal about the rubbish state of modern gaming. With equipment like, say, a 4070, it's nonsensical to "run" games with visuals worse than GTA V's at 1080p at 50 FPS. There is no excuse for that. UE5 must be abandoned, and UE6 must be a completely new product written from scratch, containing zilch of the cancer introduced in recent years. But alas.
 