
Dead Space Benchmark Test & Performance Analysis

W1zzard

Dead Space is a remake of the cult horror classic, with modern next-gen graphics based on EA's Frostbite engine. In our performance review, we're taking a closer look at image quality, differences between the graphical setting presets, VRAM usage, and performance on a selection of modern graphics cards.

 
"VRAM usage on AMD and NVIDIA is very similar and pretty serious. Even at 1080p, you're above 10 GB, which increases further with RT enabled."

@W1zzard I think this is a hold-over from your last analysis, since VRAM doesn't go above 10 GB on the performance slide.
 
Well done remake, and it seems optimised, but all I see on Ultra settings is more fog and shadows. While the latter is good, the former is not what I'd call a graphical improvement.
Is it odd that I find some scenes better at the lowest settings?

VRAM usage is fine at 1440p ... hmm, I got Dead Space free on Origin ages ago ... maybe the remake, depending on the price, will tempt me ...

@Lightofhonor oh dang, you are right :laugh: and 6.2/6.4 GB is certainly not 10 GB :laugh:
 
No graphs for RT on?

Also, the slides between Low and Ultra remind me a lot of Frostbite running the Battlefield games: they barely seem to scale, which means the game will either run well or it won't, not much in between.
Basically some RT shadows and some fog/mist on the floor... for the rest it's mostly exactly the same to my eye.
 
No graphs for RT on?
I guess since Ultra settings include RTAO, RT is already featured in the performance results ... although the comparison is between lowest settings and Ultra settings instead of custom no-RT and Ultra with RT.

Well, RT brings nothing much in this one; shadows are a bit more detailed on Ultra, but that's all I see (maybe I'm blind).
 
Performance and graphics are worse than Callisto Protocol.
 
Performance and graphics are worse than Callisto Protocol.
Graphics I might agree ... but performance? Does Dead Space "feature" a shader compilation issue like Callisto Protocol, which was horrible in terms of performance at launch (without heavy tweaking)?

Although, this being Dead Space, it has value magnitudes above Callisto Protocol in almost every other aspect aside from the one you mention.
I like Callisto Protocol, but Dead Space is legend, and in my mind Callisto Protocol is either a wannabe (with worse story, lore and execution) or a decent homage to Dead Space (Callisto Protocol is a "Dead Space lite" instead of a "Dead Space like", if you prefer).
 
Love the game, but the performance isn't that great considering RT is off. And I have no idea why AMD didn't bother releasing a game-ready driver for the RX 6000 series; those guys should step it up a bit.
Great content as usual, W1zzard.

Graphics I might agree ... but performance? Does Dead Space "feature" a shader compilation issue like Callisto Protocol, which was horrible in terms of performance at launch (without heavy tweaking)?

Although, this being Dead Space, it has value magnitudes above Callisto Protocol in almost every other aspect aside from the one you mention.
I like Callisto Protocol, but Dead Space is legend, and in my mind Callisto Protocol is either a wannabe (with worse story, lore and execution) or a decent homage to Dead Space (Callisto Protocol is a "Dead Space lite" instead of a "Dead Space like", if you prefer).
Well said.
 
VRAM usage on AMD and NVIDIA is very similar and pretty serious. Even at 1080p, you're above 10 GB, which increases further with RT enabled.

This sentence on page 5 is from the Forspoken benchmark analysis, please make the correction.
 
And I have no idea why AMD didn't bother releasing a game-ready driver for the RX 6000 series; those guys should step it up a bit.
Maybe because they're prioritising the 7000 series, and since the combined 6000/7000 driver did some damage (apparently) to the 6000 cards, I guess they are taking time to fix what isn't working for the 6000 series. (Not that I really care ... 22.11.2 is fine for most games I play, but I will gladly welcome new drivers later.)

Great content as usual, W1zzard.
Yep, agreed.

Well said.
Thanks.
 
Performance and graphics are worse than Callisto Protocol.

Maybe in really raw graphics, but the presentation to me is MUCH better in Dead Space. The entire sense of space is just way better; it's an actual ship you are on, not some linear hallway with bright yellow spiky walls strategically placed...

So no, I don't agree with you at all.

And Dead Space is, and was, just in general a much better game; hell, Callisto Protocol does not even have a right to exist imo.
 
My guess is this is one of those games where if you just turn down a few settings from Ultra to High, the FPS will jump dramatically; that's usually how this crap works, and the game often ends up looking better on top of that too. I had that experience recently with God of War: some settings I left at Ultra, some I turned down to High, ambient occlusion I turned to Medium, and I couldn't even tell a difference in-game, and my FPS jumped by 40.

Companies these days only care about their e-peen; remember to adjust your settings and you will still get high FPS ~
 
Yet another game where the 6900 XT is barely beating the 3080 at 1080p/1440p, when it used to be faster than a 3090.

AMD has got to get on it; there's no reason 6000 series owners should be missing out on game optimisation.
 
It's kind of ridiculous to complain about performance when you don't have results for the game at Medium settings. Frostbite is very scalable: Mass Effect Andromeda and the Battlefield games both take a lot of resources at maximum settings but are pretty easy to run at Medium.
 
This sentence on page 5 is from the Forspoken benchmark analysis, please make the correction.
"VRAM usage on AMD and NVIDIA is very similar and pretty serious. Even at 1080p, you're above 10 GB, which increases further with RT enabled."

@W1zzard I think this is a hold-over from your last analysis, since VRAM doesn't go above 10 GB on the performance slide.
fixed

My guess is this is one of those games where if you just turn down a few settings from Ultra to High, the FPS will jump dramatically
Check the comparison screenshots; FPS numbers are listed there. Basically a small performance difference for almost no visual difference.

No graphs for RT on?
It looks the same, and the performance hit is very small; not worth the time to bench.

It's kind of ridiculous to complain about performance when you don't have results for the game at Medium settings

I guess we could turn this into a TPU-style bar chart to make it easier to use for everyone.
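Something along these lines, perhaps; a minimal matplotlib sketch with placeholder GPU names and FPS numbers (not measured data), just to show the shape of a TPU-style horizontal bar chart:

```python
# Minimal sketch of a TPU-style horizontal FPS bar chart.
# GPU names and FPS values below are placeholders, not review results.
import matplotlib.pyplot as plt

results = {
    "RTX 4090": 144,
    "RX 7900 XTX": 121,
    "RTX 3080": 98,
    "RX 6800 XT": 92,
    "RTX 3060": 54,
}

# Reverse the order so the fastest card ends up at the top of the chart.
gpus = list(results)[::-1]
fps = [results[g] for g in gpus]

fig, ax = plt.subplots(figsize=(6, 3))
bars = ax.barh(gpus, fps, color="#4a90d9")
ax.bar_label(bars, fmt="%d")  # FPS value at the end of each bar
ax.set_xlabel("Average FPS")
ax.set_title("Dead Space, Medium settings (placeholder data)")
fig.tight_layout()
plt.show()
```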
 
It's kind of ridiculous to complain about performance when you don't have results for the game at Medium settings. Frostbite is very scalable: Mass Effect Andromeda and the Battlefield games both take a lot of resources at maximum settings but are pretty easy to run at Medium.
Although no one will disagree with the weight of Dead Space, the performance-to-graphics ratio is not good. Basically only the best GPUs on the market can reach 60 FPS at 4K :p
 
fixed

Check the comparison screenshots; FPS numbers are listed there. Basically a small performance difference for almost no visual difference.

It looks the same, and the performance hit is very small; not worth the time to bench.

I guess we could turn this into a TPU-style bar chart to make it easier to use for everyone.

I maximized and went back and forth; even Lowest vs Ultra, I almost prefer Lowest... wth... I can't say I've ever experienced that in a game before. Shadows look better on Lowest... ugh, my head hurts; game companies just make no sense anymore.

Ultra does look a little bit better in certain things, like the aliasing on the table in the first image, but still... wth lol
 
Although no one will disagree with the weight of Dead Space, the performance-to-graphics ratio is not good. Basically only the best GPUs on the market can reach 60 FPS at 4K :p
Funny, the original still looks great and runs on a literal potato.
 
Funny, the original still looks great and runs on a literal potato.

Yeah, a lot of people forget too that when you run old games like Dragon Age: Origins or the original Dead Space on a 27" 1440p monitor or a 32" 4K one, they scale so fucking nicely.

Legit, I was playing Dragon Age: Origins the other day and it looked like a next-gen game, just because of the extra clarity that the native 1440p jump gives it.
 
Yeah, a lot of people forget too that when you run old games like Dragon Age: Origins or the original Dead Space on a 27" 1440p monitor or a 32" 4K one, they scale so fucking nicely.

Legit, I was playing Dragon Age: Origins the other day and it looked like a next-gen game, just because of the extra clarity that the native 1440p jump gives it.
Somehow all the low-level APIs and enhanced hardware have given rise to a whole generation of poorly coded garbage. I can only imagine what current-gen hardware could do in the hands of people with the drive to optimize like in the 360/PS3 era.
 
Most cards nowadays have 8 GB and (much) more VRAM, which means 7 GB at 1440p and 8 to 9 GB at 4K are really not a problem.

It's debatable.

The two most popular gaming cards are still the GTX 1650 and GTX 1060. In fact most cards in the top 10 have just 6 GB.
 
It's debatable.

The two most popular gaming cards are still the GTX 1650 and GTX 1060. In fact most cards in the top 10 have just 6 GB.
Given the performance of the 3060 at 1080p, the 1060/1650 are likely unusable in this game.
 
Reflections on the two highest settings look like garbage and remind me of the similar grainy look that setting had in Resident Evil 2 before the RT update; put reflections on Medium and that issue goes away. The other issue is with VRS in the game, and that will be addressed in the next patch.

And I don't know how TechPowerUp gets such low VRAM usage in so many of their reviews, as I always see much, much higher usage in games. For instance, after about 10 minutes of playing at 4K, even with DLSS on Quality, I'm seeing over 11 GB of VRAM usage on a 3080 Ti.
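For what it's worth, most overlays and monitoring tools read VRAM *allocated* on the device rather than memory the game actively touches each frame, and engines often allocate extra as a cache when it's available, so a 12 GB card can report bigger numbers than a smaller card in the same scene. A minimal sketch of what such tools query, assuming NVIDIA's NVML Python bindings (the `pynvml` package):

```python
# Query GPU memory via NVIDIA's NVML, the same interface most
# monitoring overlays use. Note: "used" here is memory *allocated*
# on the device, not memory a game actively needs per frame, which
# is one reason different tools and runs disagree on "VRAM usage".
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values in bytes

print(f"used:  {mem.used  / 1024**3:.1f} GB")
print(f"free:  {mem.free  / 1024**3:.1f} GB")
print(f"total: {mem.total / 1024**3:.1f} GB")

pynvml.nvmlShutdown()
```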
 