
Dead Island 2: FSR 2.2

When FSR leads to 30 fps drop, smh.
 
When FSR leads to 30 fps drop, smh.

I find all of this new tech wonky and a waste of time and resources. As an industry, we should never have deviated from the standard native rendering model; we should have simply kept improving native through proper coding.

I look at some games from 15 years ago and I'm like, why the fuck does this game look so amazing? Even AAA games today don't look that amazing when upscaled to 1440p. I don't know, I think all this new tech is nonsense; I leave it all turned off and stick with native.

My point, though, is that if those limited resources hadn't been wasted on this tech, native would be better in every category.
 
TAA... the first thing I disable, or replace with regular SMAA, when launching any modern game.
 
I look at some games from 15 years ago and I'm like, why the fuck does this game look so amazing? Even AAA games today don't look that amazing when upscaled to 1440p. I don't know, I think all this new tech is nonsense; I leave it all turned off and stick with native.
What? The best games (GOTY or similar) from 2008 look like shit. Super Mario Galaxy? Wii Fit? Left 4 Dead? LittleBigPlanet might be an exception, and it still looks excellent.

And "non-native" rendering is here to stay. There is no reason to render everything each frame from start to finish when the previously calculated samples are easily available.
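To put the sample-reuse idea in concrete terms, here's a toy Python sketch (purely illustrative, not any engine's actual code; the values and blend weight are made up): instead of recomputing a converged result from scratch every frame, each frame blends one cheap new sample into an accumulated history.

```python
import random

# Toy temporal accumulation: blend each new noisy sample into a running
# history instead of recomputing the "converged" value per frame.
# alpha is the history blend weight (real TAA implementations often sit
# around 0.9, but the exact value here is arbitrary).
def accumulate(history, sample, alpha=0.9):
    # Exponential moving average: most of the previous result is reused,
    # only a small fraction of new work is contributed each frame.
    return alpha * history + (1.0 - alpha) * sample

random.seed(0)
true_value = 0.5   # the value a full brute-force render would produce
history = 0.0
for _ in range(200):
    noisy = true_value + random.uniform(-0.2, 0.2)  # one cheap sample/frame
    history = accumulate(history, noisy)

# After enough frames, the accumulated history sits close to the true
# value despite each individual sample being cheap and noisy.
print(abs(history - true_value) < 0.1)
```

The point of the sketch is just the shape of the trade-off: per-frame cost stays at one sample, while quality comes from reuse across frames.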
 
What? The best games (GOTY or similar) from 2008 look like shit. Super Mario Galaxy? Wii Fit? Left 4 Dead? LittleBigPlanet might be an exception, and it still looks excellent.

Battlefield Bad Company looks fucking amazing when you play it at 1440p native, high refresh, on ultra settings. It's smooth as butter too, no hitches or stuttering.

It was made in 2008.
 
Battlefield Bad Company looks fucking amazing when you play it at 1440p native, high refresh, on ultra settings. It's smooth as butter too, no hitches or stuttering.

It was made in 2008.
Looks like shit. Finger on the trigger while reloading. Come on. Even LittleBigPlanet looks more realistic.

image_battlefield_bad_company_2-11839-1781_0001.jpg
 
Looks like shit. Finger on the trigger while reloading. Come on. Even LittleBigPlanet looks more realistic.

image_battlefield_bad_company_2-11839-1781_0001.jpg

that's not what in-game looks like...
 
that's not what in-game looks like...
It’s an official screenshot. In-game probably looks worse.
Pure garbage:

maxresdefault-17.jpg

Just look at the city. Maybe 100x200 pixels of utter mess. Even BF1942 looks nicer, and it's not dull brown all over.
 
And "non-native" rendering is here to stay. There is no reason to render everything each frame from start to finish when the previously calculated samples are easily available.
I sadly find myself more and more in agreement with this. These extrapolated "guessed" frames and upscaled resolutions are actually genuinely pertinent with today's hardware.

Sure I'd rather see the brute force of our Almighty Chips just power through all requirements, but this is, for all the "purity" of it, stupid. It's as stupid as doing hardware raytracing in 2012.
We have software/algorithmic solutions that can relieve pressure on the hardware, and we should use them. That doesn't stop us from growing the hardware, and the software will keep improving until it reaches a generally acceptable level. Even with their Advanced Micro Delays, AMD will push out FSR 2.3, 2.4, 2.5, etc., until we get to a mostly seamless implementation and everyone just uses a degree of upscaling, since there's no point in brute forcing.
 
I sadly find myself more and more in agreement with this. These extrapolated "guessed" frames and upscaled resolutions are actually genuinely pertinent with today's hardware.

Sure I'd rather see the brute force of our Almighty Chips just power through all requirements, but this is, for all the "purity" of it, stupid. It's as stupid as doing hardware raytracing in 2012.
We have software/algorithmic solutions that can relieve pressure on the hardware, and we should use them. That doesn't stop us from growing the hardware, and the software will keep improving until it reaches a generally acceptable level. Even with their Advanced Micro Delays, AMD will push out FSR 2.3, 2.4, 2.5, etc., until we get to a mostly seamless implementation and everyone just uses a degree of upscaling, since there's no point in brute forcing.
Yup, and it’s not just upscaling, but also oversampling if the picture is stable enough. A thing that many people dismiss completely, even though they yearn for the taste of the classic SSAA.
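For anyone curious what the oversampling point means in practice, here's a toy sketch (illustrative only: it assumes a perfectly static scene, and shade() is a made-up stand-in for a pixel shader): accumulating one jittered sample per frame converges to the same average a brute-force SSAA pass would compute in a single frame.

```python
# Toy example: for a static scene, averaging jittered subpixel samples
# across frames (temporal supersampling) matches averaging them all in
# one frame (classic SSAA).
def shade(x):
    # stand-in for evaluating a pixel shader at subpixel position x
    return x * x

jitter = [0.125, 0.375, 0.625, 0.875]  # 4 subpixel offsets

# Brute-force SSAA: take all 4 samples within a single frame.
ssaa = sum(shade(x) for x in jitter) / len(jitter)

# Temporal: one jittered sample per frame, folded into a running mean
# over 4 frames, so per-frame cost is a quarter of the SSAA pass.
temporal = 0.0
for n, x in enumerate(jitter, start=1):
    temporal += (shade(x) - temporal) / n  # incremental running mean

print(abs(ssaa - temporal) < 1e-9)
```

The catch, of course, is the "stable enough" condition: once things move, the history has to be reprojected and validated before it can be reused.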
 
I'm curious, I've not seen it listed as FSR 2.2 anywhere else. Awesome if it is though, and thank you for the awesome work in taking a closer look.
 
It’s an official screenshot. In-game probably looks worse.
Pure garbage:

maxresdefault-17.jpg

Just look at the city. Maybe 100x200 pixels of utter mess. Even BF1942 looks nicer, and it's not dull brown all over.
Your opinion is irrelevant, Bad Company 2 is the best.
 
Performance degradation at every resolution when using FSR, yet FSR is always an image-quality improvement over native? I can't help but wonder what the hell went wrong with native. No sharpen pass? Just inject one. What a strange situation to end up with: less performance but better sharpening. And yeah @Dredi, I'm with you on this one; it's here to stay, and I for sure don't miss 'the old ways of rendering games' lol.
 
And "non-native" rendering is here to stay. There is no reason to render everything each frame from start to finish when the previously calculated samples are easily available.

Of course there are reasons to render a frame from start to finish: you cannot always accurately reconstruct an image from previous frames, it's simply not possible. There are entire fields of study dedicated to researching the limits of sampling and reconstructing signals.
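That limit is easy to demonstrate with the classic result from sampling theory. A toy Python example (frequencies chosen purely for illustration): a signal sampled below its Nyquist rate produces samples identical to a lower-frequency alias, so no reconstruction from those samples alone can recover the original.

```python
import math

fs = 8.0  # sample rate in Hz; the Nyquist limit is fs/2 = 4 Hz

# A 7 Hz cosine is above the Nyquist limit at this sample rate.
samples_7hz = [math.cos(2 * math.pi * 7 * n / fs) for n in range(16)]
# A 1 Hz cosine is well below it.
samples_1hz = [math.cos(2 * math.pi * 1 * n / fs) for n in range(16)]

# The undersampled 7 Hz signal yields the same samples as the 1 Hz one:
# from the samples alone, the two signals cannot be told apart.
aliased = all(abs(a - b) < 1e-9 for a, b in zip(samples_7hz, samples_1hz))
print(aliased)
```

Temporal reconstruction runs into the same wall: detail that was never adequately sampled (across space or time) cannot be reliably recovered, only guessed at.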
 
Because I had my game capped at 120 fps, both native and FSR were pushing 120 easy, so I did not notice any difference. But when I get back home I'm going to uncap the fps and see if there is indeed a performance drop (though I'm not looking forward to uncapping the fps, my GPU fans will try to launch up into the stratosphere).
 
Of course there are reasons to render a frame from start to finish, you cannot always accurately reconstruct an image from previous frames, it's simply not possible. There are entire fields of study dedicated to researching the limits of sampling and reconstructing signals.
If you read carefully, I wrote "everything each frame".

Of course you can try to state that it is possible to create a game where everything you see each frame has nothing to do with the earlier frame and thus sample re-use is impossible, but I would not want to play such a chaotic thing.
 
The first thing you might think of with these results is that it's just a CPU bottleneck, but that is not the case, or at least not completely.
Actually it is entirely caused by a CPU bottleneck: turning on FSR, for whatever reason, causes CPU performance to degrade by about 25% in this game. Something is broken.
 
Of course you can try to state that it is possible to create a game where everything you see each frame has nothing to do with the earlier frame and thus sample re-use is impossible, but I would not want to play such a chaotic thing.
That's not the case, you don't need to have a completely different frame in order for reconstruction to not work effectively, it's sufficient to go over a certain tolerance and the result will be inadequate.

Point being, reconstruction cannot completely replace the original image, and if it does, that means either something is wrong with the native implementation (like in this case, where TAA is horrible), or there was never a need for that level of fidelity in the first place and an imperfect image is enough.
 
When FSR leads to 30 fps drop, smh.
It looks better, so it makes sense to me, it's like TAA is running at 1/2 the resolution.
 
That's not the case, you don't need to have a completely different frame in order for reconstruction to not work effectively, it's sufficient to go over a certain tolerance and the result will be inadequate.
Depends on many variables, but I’d say that cases like what you describe are a very small subset of games. And even if you sometimes need to dump all existing samples, the effect on perceived fidelity is not that high, as successive unusable frames are very difficult to create in a playable game.


Point being, reconstruction cannot completely replace the original image, and if it does, that means either something is wrong with the native implementation (like in this case, where TAA is horrible), or there was never a need for that level of fidelity in the first place and an imperfect image is enough.
All you see is just reconstruction even in just basic TAA. It’s just slightly more dumb than with FSR2 etc. FSR2 just knows slightly better when some samples can be reused compared to basic TAA, and thus allows for lower sample count per frame for the same perceived fidelity.
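The "knows when samples can be reused" part usually comes down to some form of history validation. A common trick in temporal AA pipelines, shown here as a toy sketch (not FSR2's actual code; the numbers are made up), is clamping the reprojected history value to the range of the current frame's local neighborhood, so stale history can't ghost.

```python
# Toy history validation: clamp the reprojected history value to the
# min/max of the current frame's local neighborhood samples. History
# that agrees with the scene passes through untouched; history invalidated
# by disocclusion or lighting changes gets pulled toward the new data.
def validate_history(history, neighborhood):
    lo, hi = min(neighborhood), max(neighborhood)
    return min(max(history, lo), hi)

# History agrees with the current neighborhood: reused as-is.
print(validate_history(0.5, [0.4, 0.5, 0.6]))  # 0.5

# History is stale (scene changed): clamped into the plausible range,
# trading some reuse for correctness instead of ghosting.
print(validate_history(0.9, [0.1, 0.2, 0.3]))  # 0.3
```

Smarter upscalers refine exactly this decision (per-pixel confidence, motion, depth), which is why they can get away with fewer fresh samples per frame than a dumb clamp would need.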
 
What CPU was used in the tests? Technically there shouldn't be a CPU bottleneck at a high resolution, but bugs are getting more creative... lol
 
So I tested Dead Island 2 with and without FSR (on a 7900 XT), and there is no performance drop with FSR. I get 153 fps with FSR Quality, 132 fps without (running maxed-out settings at 3440x1440, with Variable Shading, Vsync and motion blur off). My guess? FSR in this game doesn't work properly with NVIDIA cards. @techpowerup, please test with an AMD card (and maybe an Intel card) to stop the misinformation spreading. Maybe it's going to be fixed in a patch, but why would AMD bother? It's not like NVIDIA are going to make DLSS available for anyone else, ever.
 
My guess? FSR in this game doesn't work properly with NVIDIA cards. @techpowerup, please test with an AMD card (and maybe an Intel card) to stop the misinformation spreading.
Your guess is wrong, and there is no misinformation in the article you're commenting on. As long as you're not CPU limited, FSR boosts performance in Dead Island 2 on both AMD and NVIDIA GPUs. But the moment you hit a CPU bottleneck, performance drops due to the buggy FSR implementation.
 
Missed opportunity in the video: you should've had more comparisons with super-zoomed vegetation for shimmering... screenshots can't show shimmering.
 