
The First Descendant Performance Benchmark

A CPU test would be better. There are many reports on Steam (mine included) where the CPU is constantly at 100% load.
I have a 5800X3D, and sometimes it's at 100% load and below 60 FPS.
I'm not seeing that kind of load on the CPU in my capture above, but I need to play some more.
 
Might be a problem with SMT, since I had it disabled for Elden Ring. SMT off seems to reduce my lag in ER.
I enabled it again for TFD, but since the servers are down, I can't test it right now. I have time in about 12 hours. :D

But I can say it was really pinned at 100% load, which is very rare outside of benchmarks for me.
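For anyone who wants to sanity-check this themselves, here's a minimal sketch (assuming Python with the psutil package; everything below is illustrative, not from the game) that samples per-logical-core load once a second while the game runs. If every logical core is pegged, the bottleneck is genuinely CPU-wide; if only about half of them are busy, an SMT/scheduling issue is more plausible.

import time
import psutil

# Sample per-logical-core CPU usage once a second for a minute.
psutil.cpu_percent(percpu=True)  # prime the counters; the first call returns zeros
for _ in range(60):
    time.sleep(1.0)
    per_core = psutil.cpu_percent(percpu=True)
    pegged = sum(1 for p in per_core if p >= 95.0)
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}% | cores at >=95%: {pegged}/{len(per_core)}")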
 

For me it sits at about 20% overall on a 7950X3D, or around 30-40% on the V-cache CCD. During loading it kicks up a bit, but otherwise it stays pretty consistent.
 
It seems all the UE5 titles are a 60 FPS experience unless you enable upscaling...
Well, yeah, Epic sorta implies that too. To my knowledge, they even heavily incentivize using it by making their own upscaler, TSR, the default go-to fallback for devs if no GPU vendor upscaler is present. It essentially replaces the fairly awful TAAU from UE4. And I remember the DF guys mentioning that in their talks with UE5 devs, they were told that UE5 kinda expects upscaling and is built with it in mind. So it seems they feel that native rendering is not going to be the way forward. And, frankly, seeing how GPU performance, especially at mainstream price points, can't really keep up with the performance demands of graphically intense titles - it makes sense.
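For what it's worth, if the game reads console variables from Engine.ini (packaged UE5 games don't always expose this), the standard knobs would look something like the sketch below; the values are illustrative, not pulled from this title:

[SystemSettings]
; 4 selects TSR as the anti-aliasing/upscaling method in UE5
r.AntiAliasingMethod=4
; internal render resolution as a percentage of output, e.g. ~67 for a quality-style upscale
r.ScreenPercentage=67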
 
Diablo with guns in 3D is sort of what Hellgate: London was supposed to be.

Sadly, that flopped because it was rushed to release as an incomplete mess with server issues, but the premise was solid.
That wasn't rushed; it was just abandoned mid-development. I mean, there wasn't a single recognizable landmark in the entire game - that's how far from completion it was.
 
After playing a little I failed to find any clear and significant difference between RT and normal.
 
Often the way unless there are many highly reflective surfaces in the game. Screen-space reflections have their limitations and RT reflections are significantly more realistic a lot of the time.

RT lighting and shadows seem almost pointless to me, because it's a very GPU-intensive way to calculate what we can already get very cheaply from shader/screen-space alternatives. Hell, pseudo-traced dynamic shadowmaps often look better than RT shadows, and even high-resolution ones are cheaper in terms of GPU power than RT shadows, whilst being temporally accurate per frame, unlike RT-anything.

My only criticism of screen-space lighting and shadow effects (the temporal delay) applies to an even worse extent to raytraced lighting and shadow effects (thanks to the many frames of accumulation the denoiser needs), so the slight delay/lag of screen-space shadows and occlusion isn't even a downside when the alternative is even laggier RT shadows!
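To put a rough number on that lag: a denoiser that accumulates history with an exponential moving average needs a couple dozen frames to converge after the lighting changes. A back-of-the-envelope sketch in Python (the blend weight is an assumed value, not taken from any particular denoiser):

# How many frames an exponential-moving-average accumulator needs to reach
# 95% of a new value after a sudden lighting change.
alpha = 0.1  # per-frame weight given to the current frame (assumption)
value, target, frames = 0.0, 1.0, 0
while value < 0.95 * target:
    value += alpha * (target - value)
    frames += 1
print(frames)  # 29 frames at alpha = 0.1, i.e. roughly half a second at 60 FPS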
 
For me it sits at about 20% overall on a 7950X3D, or around 30-40% on the V-cache CCD. During loading it kicks up a bit, but otherwise it stays pretty consistent.
It was strange, yes. In the beginning I had >80 FPS, but then <60 FPS. V-Sync was also not working at that moment; maybe there was some sort of engine bug.
I use my second monitor while gaming, so a lot is going on at the same time.
 
Another game with RT that makes it unplayable on any GPU under £800. Nvidia really is trying hard to price everyone out of PC gaming.

At least this one is free, so I don't have to stick it on my wishlist for when I get an RTX 7060.
 
Even with a 4090 I'm not sure I'd be turning on RT. I like crisp, high-resolution gaming at high framerates and anyone with the budget for a 4090 has likely already invested in a 1440p or 4K high-refresh panel. The last thing they want to do is waste that refresh rate and resolution on expensive RT effects that require low-res upscaling...

But that's just an opinion. If you like the subtle visual difference and are prepared to pay the performance penalty then you do you; I'm not judging...
 
What monitor do you game on?
 
Even with a 4090 I'm not sure I'd be turning on RT. I like crisp, high-resolution gaming at high framerates and anyone with the budget for a 4090 has likely already invested in a 1440p or 4K high-refresh panel. The last thing they want to do is waste that refresh rate and resolution on expensive RT effects that require low-res upscaling...

But that's just an opinion. If you like the subtle visual difference and are prepared to pay the performance penalty then you do you; I'm not judging...
This right here.

And it's why I chose 1440 UW; I prefer a 100-144 FPS experience to 4K 60.

That's native, too; no upscaling or FG needed.
 
What monitor do you game on?
I'm on an older Odyssey G7. It's no longer the fastest, but it's still decent.


I should point out that I don't game on a 4090. I've had various RTX cards since the first generation and peaked with a 3090 for a few months, but it was crap: hot, power-hungry, and all of the RT effects in games at the time looked kinda bad to me. Once I'd finished the VDI testing I was using it for, I returned it to the office and put my cool, quiet, adequate RTX 3070 back in.

Currently I'm using a 7900 GRE, which is great for 1440p120 with a strobing backlight, giving motion clarity that's hard to argue with. I think it might even be better than my OLED TV in the other room, which lacks strobing.
 
Min FPS with AMD cards is horribly bad, and that may be more important than the average FPS. Probably lots of stutter and lag on the red side.
It was a lucky decision that a few months ago I went with a used 3080 for €350 instead of saving up for a 7700 XT; even with newer drivers, the red side still looks like poop.
 
Are we looking at the same review? The 3080's min FPS is lower than that of its competition, the 6800 XT.
If you are comparing used vs. new, there are used AMD GPUs too.
[attached screenshot: the review's minimum FPS chart]
 
lol, I saw this too; that guy is just a random troll.

After playing for a few hours last night, RT is not really worth turning on in this game, as I didn't see much difference.

Also, when you get into a lobby or out in the field with multiple characters, your FPS takes a hit.

Even without RT, with enough characters on screen you take a noticeable hit to FPS.
 
On RDNA3 I find that RT performs alright on High; the max RT setting just destroys performance while I struggle to tell the difference.
From the screenshot comparison it seems to mostly affect shadows, but shadows are often the first thing people turn down in a shooter.
 

I think the difference is obvious even going from high RT to ultra RT, but the drop in performance for this kind of game is not worth it; medium/high seems like the sweet spot.

The game still looks good even if RT is off, though.
 
It is obvious in the second scene, but in the first and third scenes the difference is much more subtle.

Being Unreal 5, this game might support both the software and hardware versions of Lumen.
So turning RT off could just mean it runs the software version, which would explain the very high CPU usage others have reported.
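If it is Lumen under the hood, the software/hardware split maps to standard UE5 console variables; a hedged Engine.ini sketch, assuming the packaged build reads them at all:

[SystemSettings]
; 1 = Lumen for dynamic global illumination
r.DynamicGlobalIlluminationMethod=1
; 0 = software Lumen (distance-field tracing), 1 = hardware ray tracing (needs a DXR-capable GPU)
r.Lumen.HardwareRayTracing=1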
 
On RDNA3 I find that RT performs alright on High; the max RT setting just destroys performance while I struggle to tell the difference.
From the screenshot comparison it seems to mostly affect shadows, but shadows are often the first thing people turn down in a shooter.
Even on medium or high, I saw my average hitting 50 FPS with heavy action, and in the middle of a fight that is very noticeable, so RT stays off for me.

I'm also expecting AMD's next driver to be game-ready for this, so we may see more performance.
 
The current 24.6.1 driver already has support for this game.
What this game needs is an update to FSR 3.1 from the current 3.0, and of course better optimization.
As far as Unreal 5 games go, this game's performance isn't the worst, but it could be better.
 
It is obvious in the second scene, but in the first and third scenes the difference is much more subtle.

Being Unreal 5, this game might support both the software and hardware versions of Lumen.
So turning RT off could just mean it runs the software version, which would explain the very high CPU usage others have reported.

I was talking about testing it in-game in a variety of areas, up to the greenish jungle-looking area... To me it's obvious, but also not worth the performance cost, as it's mostly the superior AO coverage making the largest difference; the GI is very area-dependent.
 
The current 24.6.1 driver already has support for this game.
What this game needs is an update to FSR 3.1 from the current 3.0, and of course better optimization.
As far as Unreal 5 games go, this game's performance isn't the worst, but it could be better.
You are right, my bad; 24.6.1 does support it. But more optimization is needed for sure, from both AMD and the dev side I think. I've noticed many areas in-game with inconsistent performance.
 