
Remnant II Benchmark Test & Performance Analysis

I'm not sure why so many people are down on the graphics. Aside from some of the larger pieces looking generic (probably procedural generation tiles at work), the details within the world are fantastic. Perhaps they could have done a better job designing the procedural tiles, but overall I can at least see why it uses hardware heavily. Hopefully they'll be able to do some real optimization, but I'm not holding my breath.
... The DLSS uplift is abnormally large though...
This is what I noticed right away when I disabled DLSS, saw the frame rate, then put it on Balanced. I've never seen another game that gains ~75% performance just going from DLSS off to Balanced.
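As a sanity check on that number, here's a minimal sketch of the render-resolution arithmetic, assuming Nvidia's standard published DLSS scale factors (the 4K output resolution is just an example):

```python
# Back-of-the-envelope: DLSS renders internally at a fraction of the output
# resolution per axis, then upscales. Scale factors below are Nvidia's
# standard published values.
DLSS_SCALE = {
    "DLAA": 1.0,               # native-res AA, no upscaling
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160  # example 4K output
for mode, s in DLSS_SCALE.items():
    rw, rh = round(out_w * s), round(out_h * s)
    print(f"{mode:>17}: renders {rw}x{rh} ({s * s:.0%} of native pixels)")

# Balanced shades only ~34% of native pixels. A ~75% fps gain is huge next
# to the typical 30-40%, yet still well under the ~3x ceiling that pure
# pixel scaling would allow, so other costs (the upscaling pass itself,
# CPU, fixed per-frame work) still eat a chunk of the frame.
```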
 
As said by others, it doesn't look bad visually at all to me, and high-end graphics cards are not future-proof.
 

I watched a video of a 4090 almost doubling its framerate in Quality mode. Typically games see a 30-40% gain at best. It's interesting. I didn't care for the first one, so I probably won't pick this up.
 
Unreal Engine games never disappoint with poor performance, from any studio save Epic Games themselves. UE4 games don't perform nearly as poorly anymore simply because graphics cards have gotten multiple times faster since the engine's release.
You have to give credit to The Coalition, probably the best UE dev out there.
 
(attached image: benchmark results chart)

Looks like real bandwidth and core count matter. Only the 4090 maintains its usual leading position, and even a 7900 XT moves past the 4080. But in minimums it's a green bloodbath, to be honest.
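For reference, a rough sketch of those two metrics from the public spec sheets, with bandwidth derived as bus width / 8 × memory speed (stock figures, not measured numbers):

```python
# Bandwidth and shader counts for the cards mentioned, from spec sheets.
# GB/s = (bus width in bits / 8 bits per byte) * memory speed in Gbps.
cards = {                 # (bus bits, Gbps, shaders)
    "RTX 4090":   (384, 21.0, 16384),
    "RX 7900 XT": (320, 20.0,  5376),
    "RTX 4080":   (256, 22.4,  9728),
}
for name, (bus, gbps, shaders) in cards.items():
    print(f"{name:>10}: {bus / 8 * gbps:6.1f} GB/s, {shaders:>5} shaders")
# -> 1008.0, 800.0, and 716.8 GB/s: the 7900 XT out-muscles the 4080 on
#    raw bandwidth, which matches its position in this game.
```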

Welcome to the future...
 
The game is not on my playlist; I just wanted to see next-gen graphics, but it's disappointing yet again.

We may have to wait for Alan Wake 2 to make use of our shiny and painfully expensive GPUs.

Radeons are fast in games without RT, as expected. I think the devs tried really hard to keep VRAM usage under ~8 GB.
 
The game looks very good during gameplay, but this is not an AAA title, so using UE5 does not automatically mean its assets have the quality of a God of War game or something.
It does, however, mean that the GPU load that comes with Nanite will be felt in full either way.
A couple of recommendations:
- Lowering shadows massively improves performance in this title without necessarily compromising the quality of the visuals. With shadows on medium you can keep everything else on ultra and get a decent enough framerate (a rough config sketch follows after this list).
- The game requires very fast reactions, very often, so using DLSS 3 frame generation would be ill-advised. If you're on Nvidia, stick to DLSS 2 upscaling.
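On the shadows point, here's a hedged sketch of setting that outside the game, assuming this title follows the usual UE4/UE5 convention of exposing quality tiers through [ScalabilityGroups] in GameUserSettings.ini. The config path and folder name are guesses; check your own install:

```python
# Sketch: force medium shadows via Unreal's scalability groups.
# sg.ShadowQuality runs 0 (low) to 3 (ultra), sometimes 4 (cinematic).
import configparser
from pathlib import Path

# Hypothetical location for this game; adjust to wherever your install
# keeps its Saved/Config directory.
ini_path = (Path.home() / "AppData/Local/Remnant2/Saved/Config"
            / "Windows/GameUserSettings.ini")

cfg = configparser.ConfigParser(strict=False, interpolation=None)
cfg.optionxform = str  # preserve UE's CamelCase key names
cfg.read(ini_path)

if not cfg.has_section("ScalabilityGroups"):
    cfg.add_section("ScalabilityGroups")
cfg["ScalabilityGroups"]["sg.ShadowQuality"] = "1"  # 1 = medium

with ini_path.open("w") as f:
    cfg.write(f, space_around_delimiters=False)  # keep UE-style key=value
```

The same idea works for the other sg.* groups (sg.PostProcessQuality, sg.EffectsQuality, etc.) if you want finer control than the in-game presets.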
 
To an extent. Architecture is more important. Otherwise the 5700 XT should have a commanding lead over the 6600 XT, and the 3080 should have a sizeable lead over the 4070 (see the numbers below).
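For reference, the spec-sheet figures behind those two pairs; raw bandwidth alone would predict the wrong winner in both, since the newer cards lean on large on-die caches instead. A quick sketch:

```python
# Spec-sheet bandwidth vs shader count for the counterexample pairs.
cards = [
    ("RX 5700 XT", 448, 2560),  # GB/s, shaders
    ("RX 6600 XT", 256, 2048),  # + 32 MB Infinity Cache
    ("RTX 3080",   760, 8704),
    ("RTX 4070",   504, 5888),  # + 36 MB L2
]
for name, gbs, shaders in cards:
    print(f"{name:>10}: {gbs} GB/s, {shaders} shaders")
```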
It's going to be very interesting to see more UE5 content going forward... if this persists across games, those RDNA2/3 cards are looking mighty good and Nvidia has a problem. It's staggering to see Ada lose more than a full tier of positioning, even on the 4090 in fact.

Another insight into why they sell DLSS so heavily. You'll need it... that's not a great development for a proprietary tech. At the same time, though, the fps gained is similarly staggering there. It's a crutch, but a good one.
 
The same applies to FSR2, btw. I'm playing on the settings I described above at 1440p with Quality upscaling, and fps hovers mostly around 75-80.
 
Damn, I'm beginning to think even the latest and greatest cards with lots of VRAM aren't "future proof".

Fools and money.
They never are. Futureproofing has always been a "best guess" kind of thing. You can guess a little better if you inform yourself, but you're still guessing.
 
What a letdown on the performance. I am still going to buy it since the family wants to play it; I'll have to use DLSS just to get any decent frame rate at 4K.
I haven't yet seen a game that deserves to be played at 4K. I would use DLSS without bad feelings.
 
Resale price should be considered before future proofing, but rarely talked about ;)
 
That also falls into the "best guess" category. It depends on what the next gen looks like, and we never know that.
 
Not that I'm defending poor optimization or anything, but who in their right mind plays on max settings anyway? Most of the time, the difference between High and Ultra is barely perceptible on stills, let alone while you're blasting away at a horde of cockney elves. Meanwhile, the framerate boost from switching down to High is considerable.

Max is for screenshots, High is for actual gameplay.
 
Not disagreeing with that. I've always bought mid-range cards and have been happy to fiddle with settings. Like you said, a lot of performance can be gained with minimal IQ sacrifice. But that's me and my ~$250 cards. Idk if someone who spends $500, or lately over $1,000, on a video card is happy to find out they need to tone stuff down to play at 60 fps.
Sure, some games have over-the-top settings that are meant to be more of a tech-demo and not really expected to be turned on. But that's definitely not the case here.
 
It's not a UE5 problem. Porting the same scene from UE4 to UE5 without any tweaks hands you a nice 20-30% boost in performance, and the scene looks better on top. Epic did some amazing things with this engine; it's a shame to see developers still not utilising it to its full potential. Only Epic Games and The Coalition do Unreal Engine justice (I wish I could say the same for Respawn, but they keep dropping the ball).
 
Yet another 2023 game whose looks don't seem to warrant the requirements.
 
And to think some people were disappointed it doesn't use Lumen too. SLI two RTX 4090s pushing well over 1000 W and you could see playable framerates at 1440p.
Hehe, the stupidity of things we have reached: masks, booster shots, mining, alien sightings, ultra-expensive GPUs. It's getting really hard to stay in line with the new normal.
 
At least the performance is the only downside to the game. It's unfortunate it doesn't run better, but it's far from unplayable.

The game itself is super enjoyable, with good boss fights and enemy design.
 
That's fair. I'd take a good game that needs toning down some details over a flashy title that's basically Far Cry 2023.
 
The 4090 can't do SLI, by the way; NVLink was dropped for the 40 series.
 
I was thinking it odd that "Balanced" was not included. From the review:
Remnant II is one of the first games that has DLAA as an additional DLSS quality profile. The complete list of DLSS settings is: Ultra Performance, Performance, Quality and DLAA.

Also, some games get that kind of uplift even using Quality. Metro Exodus Enhanced Edition, for example.
 