
Deathloop Benchmark Test & Performance

"Needs 10GB VRAM for 900p". LOL. Every game Arkane have made with the Void Engine (Dishonored 2, Deathloop) has launched to complaints about performance. And every other highly rated well optimised games Arkane have made (Dishonored (Unreal 3), Prey (CryEngine 4), etc), have all been made on something not-Void. Should have just stuck with Unreal.

It does not need 10GB

8GB RTX 3070 beats 6700XT 12GB at 4K

and 8GB RTX 2080 beats 12GB RTX 3060

This sure looks like a terrible port, but it seems that it's nothing unusual for Arkane (not that I would know personally as I buy games months after release).

For how it looks, it should be running at least twice the FPS it does now, not to mention that a lot of people are suffering from constant stuttering on top of all that.

Digital Foundry has an optimized settings guide for PC.

Also, an RTX 2070 Super performs better than the PS5 version at the same settings.
 

Those 8GB cards are scoring higher in "average" FPS, but the 10GB+ memory usage is forcing them to load assets during gameplay, resulting in janky frametime graphs (which can translate to bad stutters or sudden slowdowns in-game). Check out the 0.5% and 0.2% minimum FPS and you'll see the 8GB RTX 3070 losing to the 12GB 6700 XT.

Also, I don't remember any other game that uses 10GB+ of memory at 900p. What is going on behind the scenes?
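For reference, those percentile lows come straight from the raw frametime log; one common convention is to average the slowest X% of frames and convert back to FPS. A minimal sketch of that calculation (assuming a plain list of per-frame times in milliseconds from any frametime logger):

```python
# Minimal sketch: average FPS vs. percentile lows from per-frame times (ms).
# One common convention: average the slowest X% of frames, convert to FPS.
def fps_stats(frametimes_ms, percentiles=(1.0, 0.5, 0.2)):
    worst_first = sorted(frametimes_ms, reverse=True)  # slowest frames first
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    lows = {}
    for p in percentiles:
        n = max(1, int(len(worst_first) * p / 100.0))  # slowest p% of frames
        lows[f"{p}% low"] = 1000.0 * n / sum(worst_first[:n])
    return avg_fps, lows

# Two spikes barely move the average but crater the 0.5%/0.2% lows.
avg, lows = fps_stats([12.1, 13.0, 12.4, 41.7, 12.8, 55.2, 12.5] * 100)
print(f"avg: {avg:.1f} FPS,", lows)
```

That's why two cards can post near-identical averages while one of them stutters noticeably more.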
 
The 30XX-series VRAM situation pisses me off. 8GB on a 3070 Ti is absurd. I get that some VRAM usage in certain games is actually allocation and not usage, but still. 8GB of VRAM on these cards is a slap in the face on top of an already shitty market situation.
 
Seems like @W1zzard missed the RT AO option, as it's well hidden within the menu (with AMD CACAO as the default). Overall, RT AO doesn't cost much FPS, and it looks better than CACAO and HBAO+.


Now and then some balanced/quality cacao, I like it. :D

 
And yet at 4K the 3070 matches the 6800 non-XT in Deathloop.
 
The game is not challenging at all. I always play games at the highest difficulty, so beating the game is its own reward; there is no such thing with Deathloop. No wonder game journalists like this game so much while they despise Doom Eternal.

if I were to play a game only for its story I might as well go read a book ;)

Yeah, same. I want games to actually feel like a game instead of being a glorified movie with controls tacked on.
 

[attached chart: vram.png, VRAM usage]


Seems like dedicated VRAM usage tops out at some point. DF measured some performance penalty with texture streaming (~10%), so only the 3090 will come out on top in extended gameplay. Kinda funny when the entire game folder is only 30GB.
Note: I use an FOV of 110, so it might use more VRAM.
 
[attached chart: frametime.png, frametime comparison]


As usual, higher averages don't tell the whole story. Even at 1440p, the RTX 3080's frametime graph is a mess compared to how stable the RX 6800 XT looks.

If this isn't a driver issue, then the 3080 is probably struggling with asset loading during gameplay due to high VRAM usage.
 

I don't know what planet you are from, but a frametime variance of 2ms (from 12-14ms) is simply undetectable by human perception.
So yeah, while the graph for the 6800 XT looks tighter, it doesn't mean anything. What you should look for are frametime spikes of high magnitude (>30ms), which indicate micro-stutters.

The 3070 maintains a higher 0.2% percentile than the RX 6800 at 1440p, so frame delivery is good. At 4K, pretty much every GPU but the 3090 will run out of VRAM in extended gameplay, as I have tested (~20GB VRAM usage).
[attached chart: base.png]
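To put numbers on that distinction, here's a quick sketch of the difference between benign variance and actual stutter (the 30ms threshold is the one from this post; real analysis tools also compare against the local average rather than a fixed cutoff):

```python
# Sketch: benign frametime wobble vs. actual stutter. Small variance around
# the average never trips the threshold; a high-magnitude spike does.
def find_stutters(frametimes_ms, spike_threshold_ms=30.0):
    return [(i, ft) for i, ft in enumerate(frametimes_ms) if ft > spike_threshold_ms]

wobbly_graph = [12, 14, 13, 12, 14, 13, 12]  # "messy" 12-14ms graph, no stutter
tight_with_hitch = [13, 13, 13, 48, 13, 13]  # tight graph, one real hitch
print(find_stutters(wobbly_graph))      # [] -> imperceptible to the player
print(find_stutters(tight_with_hitch))  # [(3, 48)] -> a visible micro-stutter
```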
 
In most of those scenes I could barely see anything ray tracing changed. It's good to see that it's finally being called out. It's useful when you do modeling or something, but for games it's still not there yet, really.
 
So wait, you're on a time constraint and your progress resets if you die?
 
LOL! At this stage only big reviewers like TPU and the 'miners' can get top-tier cards. It's just like watching F1! :D Super fast car(d)s, but you know you won't be getting (in) one anytime soon.

At least I ditched gaming a long time ago, so at this point this situation is just free lulz for me.
 
Why is this PS4-looking game so demanding? :kookoo:
Bad game engine. They are still using the Void engine, which is a modified id Tech 5, one of the worst game engines; nearly all of its games had performance problems and bad PC ports. What's really puzzling is that they have access to id Tech 6 and 7, both marvelously well-optimized engines. Why they couldn't migrate to newer tech is beyond me...

"Needs 10GB VRAM for 900p". LOL. Every game Arkane have made with the Void Engine (Dishonored 2, Deathloop) has launched to complaints about performance. And every other highly rated well optimised games Arkane have made (Dishonored (Unreal 3), Prey (CryEngine 4), etc), have all been made on something not-Void. Should have just stuck with Unreal.
Bethesda mandates that they use id Tech, and the Void engine is id Tech 5, so why not upgrade to id Tech 6 or 7?
Only they know.
 

With RT off I see nothing significant in the frametimes, which matches @javanoia's chart (look at his chart, red vs green line).

Will do a run with RT enabled later, on both cards, for more data.

will run out of VRAM
Once again, I think this might be intentional. I suspect the algorithm is something like:
"keep loading assets into VRAM; only when <10% VRAM is left, unload assets, oldest first or least-often-used first". This is a typical garbage-collector implementation.

The logic here being that unused VRAM doesn't do anything for you, so rather than leave it idle, keep as many assets in memory as possible and GC/evict as needed. COD uses something similar.
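Something like this hypothetical sketch of that eviction policy (illustrative only, not Arkane's actual code; the class name, asset sizes, and the 10% floor are made up):

```python
from collections import OrderedDict

# Hypothetical sketch of the suspected policy (not Arkane's actual code):
# keep loading assets until free VRAM drops below a floor, then evict the
# least-recently-used assets to make room.
class VramAssetCache:
    def __init__(self, budget_mb, min_free_fraction=0.10):
        self.budget = budget_mb
        self.min_free = budget_mb * min_free_fraction
        self.assets = OrderedDict()  # asset_id -> size_mb, in LRU order
        self.used = 0.0

    def load(self, asset_id, size_mb):
        if asset_id in self.assets:
            self.assets.move_to_end(asset_id)  # recently used: evict last
            return
        # Evict oldest entries only once free VRAM would dip below the floor.
        while self.budget - (self.used + size_mb) < self.min_free and self.assets:
            _, evicted_mb = self.assets.popitem(last=False)
            self.used -= evicted_mb
        self.assets[asset_id] = size_mb
        self.used += size_mb

cache = VramAssetCache(budget_mb=8192)
for i in range(400):                 # request ~12GB worth of 30MB textures
    cache.load(f"texture_{i}", 30.0)
print(f"{cache.used:.0f} / {cache.budget} MB used")  # tops out near ~90%
```

A policy like that would also explain why the dedicated-usage numbers plateau instead of climbing forever.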
 

Can you test PCIe bandwidth scaling with this game? It probably uses texture streaming so effectively that lower-VRAM GPUs can play just fine with Very High texture quality, albeit with some performance loss. DF mentioned this in their video.

But probably no one cares about this game at this point :twitch:
 

IMO there's no point spending the money when I can watch entire games for free on streams, with no DRM, game stores, loading times, disk space, or heat/power/cooling to deal with as a bonus.
 
I enjoy the game. Lots of fun. It's really one of the first games where I've been able to play around with ray tracing + FSR 1.0. The gameplay is fun. The shoot 'em up approach tends to be a fun-ruiner for me, so I just stick with the stealth play. I like the story and the humor.


Thanks for the review.
 
I always find it interesting when situations like this come up. In this game, RT is tacked onto an already visually 'meh' game, with very little additional visual benefit, plus a substantial performance cost, for an odd choice of effects: just sun shadows and AO. Games with terribly implemented, performance-tanking visual features aren't new. You can find very modern examples where the highest volumetric fog/cloud/lighting settings utterly tank performance, yet dropping them from 'Ultra' to 'High' or 'Medium' has a negligible visual impact while returning disproportionately more performance for the trade, and I don't hear people saying volumetrics are dumb and gimmicky. To me, this game's implementation changes nothing about the trajectory of RTRT in video games; it's just another entry for the poor-example pile, for a variety of reasons, unfortunately.
 
This is the absolute opposite of a good RT game...
 
The game looks good, but it's got the Denuvo DRM infection, so I'll give it a miss. I've literally got hundreds of others to play.

+1 on that, forgot to mention in my article
Shame on you. Now go and sit in the corner. :p
 
So AMD has adaptive resolution with FSR? That is awesome. I have been waiting for NVIDIA to add this to DLSS for ages. Dynamic resolution scaling is one of the best rendering features.
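For anyone wondering what's under the hood, dynamic resolution is essentially a feedback loop on GPU frame time. A hedged sketch of the idea (the 16.6ms target, gain, and bounds here are made-up values; real engines filter over many frames and often scale each axis separately):

```python
# Sketch of a dynamic-resolution feedback loop (illustrative values): nudge
# the internal render scale so GPU frame time tracks a target, then let an
# upscaler (FSR/DLSS) reconstruct to the output resolution.
def update_render_scale(scale, gpu_ms, target_ms=16.6,
                        lo=0.5, hi=1.0, gain=0.1):
    error = (target_ms - gpu_ms) / target_ms  # + under budget, - over budget
    return min(hi, max(lo, scale + gain * error))

scale = 1.0
for gpu_ms in [14.0, 15.5, 19.0, 22.0, 18.0, 16.0, 15.0]:
    scale = update_render_scale(scale, gpu_ms)
    print(f"{gpu_ms:4.1f} ms -> render {int(2560 * scale)}x{int(1440 * scale)}")
```

The payoff is a mostly steady frame rate: resolution dips during heavy scenes instead of the frametime spiking.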
 


Well... resolution 2560x1440 = needs more than 10GB of VRAM... I will skip this game...
 
Have you heard about caching? A lot of games do it these days.
 