
Lords of the Fallen Performance Benchmark

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,665 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Lords of the Fallen has been released, showcasing the incredible power of Unreal Engine 5. This souls-like game offers breathtaking visuals that rank among the most impressive we've seen, but it demands powerful hardware, too. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide selection of modern graphics cards.

 
So my GPU would do about 19 fps at 1440p with RT.....

Welp, not buying anyway. :D
 
Getting the stutters that you can see in Daniel Owen's video at 14:13 a lot... sometimes several in a row, and they can happen even during fights because enemies chase you into transition zones. Also, I'm very CPU limited in the hub area for some reason and get bad frame rates, but only there.

[link removed]

I'm on a 5600X and 7900 XT, and I feel like the 5600X is pretty outdated now, although in Daniel Owen's video the stutter happens even with a 7800X3D.

That said, the game does look amazing and most of the time when I'm within a single area it's very smooth.
 
that you can see in Daniel Owen's video at 14:13 a lot
I ran through this area many times .. kept failing the first boss quite often t.t .. no such stutter here .. just retried .. nothing
 
ICYMI, the graphs have "Ultraw" and "lowh" written, which I doubt were intentional

As for the performance, jesus fucking christ. The game looks quite average, so why is the performance this abysmal? Have developers suddenly forgotten how to optimize their games? Is it Unreal Engine? Or do the ultra settings without RT turn on something that tanks performance without any visual return? Because the scalability on display between low and high graphics is impressive (though I'd argue the game's actual performance should be closer to what you get at low, judging by how it looks). Nanite, perhaps?

Another UE5 game, another performance disasterclass. I'm shocked that it's this bad on relatively high-end hardware at 1080p.
 
What, another AAA release that does just fine with 8GB VRAM? But, but, but...
 
ICYMI, the graphs have "Ultraw" and "lowh" written, which I doubt were intentional
I did, but was too lazy to fix it because I didn't save the chart inputs, and thought "nobody will ever notice" .. fixed
 
What, another AAA release that does just fine with 8GB VRAM? But, but, but...

Yeah, but to be honest, some of the textures, like the wood beside the barrels, look like they're straight out of Morrowind.
 
Honestly, the future is looking grim... this performance for a game that looks virtually no better than Dark Souls 2...
Christ, is this engine just super inefficient or what?

Also, isn't it a bad idea to run frame generation in a game like Dark Souls? You wouldn't want it for competitive games, and I'd think this isn't suited to it for the same reason.
Maybe they're just hoping to dupe people with it to make up for the generally poor performance, idk.

EDIT: though I do want to say it does seem to scale decently with settings, so that's fine.
 
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
Ah, the classic: make up an accusation so you can easily refute it immediately.

And to explain that a bit. If it was known/accepted "drivers don't work", nobody would be buying those cards. Really simple.
What is known/accepted is that ATI/AMD drivers were historically more faulty and incurred higher overhead. What is also known/accepted is that over time they got up there with Nvidia and are now trading blows. The only significant thing I can name about AMD drivers off the top of my head is that they seem to regress idle power draw every now and then.
 
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
Didn't encounter any crashes or similar issues on any cards. Check the Steam forums, maybe it's just me
 
The "more accurate ray tracing methods" descriptions are in reference to Lumen lighting and reflections. Specifically, this game uses software lumen only. Lumen is still considered a form of ray tracing, but it is not hardware accelerated in this game (it doesn't need to be due to the simplified nature of Lumen's ray tracing). Technically, you can even use those settings with GTX graphics cards without too severe of a hit to your frame rate. (edit: i mean, the frame rate will probably be poor in general, but the hit from turning on high GI or reflections won't be heavier than other cards)

If you include this game in the GPU benchmarking suite, I would recommend the ultra preset for GI and reflections for the basic test suite.
 
Specifically, this game uses software lumen only
Do you have a source for that? But I agree with you, especially with the RX 5700 XT being able to run it, even though UE has various fallbacks
 
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good, it's designed this way. The built in resolution scaler (Not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried with DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built in scaler. I'm playing on a 6950XT at 4K with everything except global illumination on ultra at 65% resolution and I stay above 60fps with the only drops in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more, it's quite good.
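For a rough sense of why the built-in scaler helps so much: the scale is linear per axis, so the rendered pixel count drops with its square. A quick sketch (plain Python arithmetic on the resolutions mentioned above, nothing measured):

```python
# Rough arithmetic: internal render resolution and pixel counts
# at the scale factors discussed above. Illustrative only.

def internal_res(width, height, scale):
    """Return the internal render resolution at a given linear scale factor."""
    return int(width * scale), int(height * scale)

native_4k = (3840, 2160)
native_1440p = (2560, 1440)

for scale in (0.60, 0.65):
    w, h = internal_res(*native_4k, scale)
    print(f"{int(scale * 100)}% of 4K -> {w}x{h} "
          f"({w * h / 1e6:.2f} MP, {w * h / (3840 * 2160):.0%} of native 4K pixels)")

w, h = native_1440p
print(f"native 1440p -> {w}x{h} ({w * h / 1e6:.2f} MP)")
```

So 65% of 4K renders roughly 42% of the native pixel count, slightly fewer pixels than native 1440p, while the output is still reconstructed at 4K.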
 
Do you have a source for that? But I agree with you, especially with the RX 5700 XT being able to run it, even though UE has various fallbacks
I guess I don't; I'm just basing this off my knowledge of how UE5 works from screwing around with it a little, but I'm not an expert. As far as I'm aware, the stock "Global Illumination" setting in UE 5.2 just controls Lumen in its software mode. Specifically, High, Ultra, and Epic are software Lumen, and Low and Medium are a very basic fallback. Hardware Lumen is a separate toggle you must enable in the settings, and this game doesn't have that setting anywhere, nor is it in the ini. The game also doesn't have the Epic quality option available, though you can do an ini edit to enable it (don't: it performs very badly and barely improves anything).

Software Lumen uses a signed distance field, while hardware Lumen replaces that with a BVH. Cards like the 5700 XT should choke and fail miserably when trying to run HW Lumen, so if it can run High and Ultra quality at the same level of playability as the 6600 XT, then it should be SW Lumen.
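If anyone wants to poke at this themselves: UE5 exposes this stuff through console variables, which can also be read from or forced in Engine.ini. A minimal sketch (Python; the ini path is hypothetical and the cvar list is my assumption based on stock UE5, not confirmed for this particular game) that scans a user Engine.ini for the Lumen-related cvars:

```python
# Sketch: scan a UE5 Engine.ini for Lumen-related console variables.
# The path below is hypothetical; adjust it for the actual game folder.
# Cvar names are the stock UE5 ones (assumption, not confirmed for this title):
#   r.DynamicGlobalIlluminationMethod  (1 = Lumen)
#   r.ReflectionMethod                 (1 = Lumen reflections)
#   r.Lumen.HardwareRayTracing         (1 = hardware Lumen, 0/absent = software)
from pathlib import Path

ENGINE_INI = Path.home() / "AppData/Local/SomeGame/Saved/Config/Windows/Engine.ini"

CVARS = (
    "r.DynamicGlobalIlluminationMethod",
    "r.ReflectionMethod",
    "r.Lumen.HardwareRayTracing",
)

def find_cvars(ini_path):
    """Return {cvar: value} for any of the listed cvars found in the ini file."""
    found = {}
    if not ini_path.exists():
        return found
    for line in ini_path.read_text(errors="ignore").splitlines():
        line = line.strip()
        for cvar in CVARS:
            if line.lower().startswith(cvar.lower() + "="):
                found[cvar] = line.split("=", 1)[1].strip()
    return found

if __name__ == "__main__":
    values = find_cvars(ENGINE_INI)
    for cvar in CVARS:
        print(f"{cvar} = {values.get(cvar, '<not set, engine default>')}")
```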
 
Hardware Lumen is a separate toggle you must enable in the settings, and this game doesn't have that setting anywhere, nor is it in the ini
Could it be that the setting is just a front for multiple settings that get adjusted internally depending on its value?
 
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
"generally accepted on the internet on the internet that Radeon drivers don’t work"
People write a lot on the internet, and the issue is that it's hard to tell whether they're telling the truth or not, because there's a lot of toxicity even in hardware matters. So after much thought, I decided that only by trying the products myself would I get a real picture, and my experience is as follows: I have 5 PCs with a GTX 1080, RTX 2070, 5700 XT, 6800, and 6900 XT. Apart from the usual problems that get solved through driver updates, I have not faced anything that would make me definitively choose one company over the other. Specifically, I'll say that I run the 6900 XT together with MPT (MorePowerTool) and the card is as fast in raster as a 3090 Ti, with RT performance similar to a 3080, without problems in 24/7 use. That's how stable the AMD drivers are.
 
[attached image: vram.png]

"We need more than 8 GiB of VRAM," they said
 
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good, it's designed this way. The built in resolution scaler (Not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried with DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built in scaler. I'm playing on a 6950XT at 4K with everything except global illumination on ultra at 65% resolution and I stay above 60fps with the only drops in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more, it's quite good.
Now go to lower resolutions, or realize that many people (myself included) don't like running upscalers because they can't really offer the sharpness of a native image, which also comes with far less ghosting (if the AA method isn't garbage). 60% of 4K upscaled might look nice, but try doing that at 1080p and you will be greeted with horrors unseen, literally: it looks like smeared vaseline. On top of that, even on beefy GPUs you can't run at native 1440p at acceptable frame rates, while contemporary games on proprietary engines can at times look better than most UE5 games and run light years better too; see for example MWIII (a beta!), CS2, Half-Life: Alyx, F1 2023, Control, RDR2. Looks are mostly subjective, I'll concede, but all the games listed run light years faster than most UE5 games while arguably looking better in many cases, though again, subjective.
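To put numbers on that contrast (same kind of quick arithmetic as earlier in the thread, nothing measured): the scale factor is the same, but the upscaler's input resolution is wildly different.

```python
# Why the same scale factor feels so different at 1080p vs 4K:
# the upscaler's input resolution is what actually matters.
targets = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
SCALE = 0.60  # the ~60% ("balanced"-ish) scale discussed above

for name, (w, h) in targets.items():
    iw, ih = int(w * SCALE), int(h * SCALE)
    print(f"{name}: renders {iw}x{ih} internally ({iw * ih / 1e6:.2f} MP), "
          f"then upscales to {w}x{h}")
```

At a 4K target the reconstruction starts from roughly a 1296p image; at a 1080p target it only has a 648p image to work with, hence the vaseline.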
 
I cannot believe I would barely get 100 fps with a 4090 at 1440p. I get over 100 fps with a 2070 SUPER at ultra settings + ray tracing in games like Dying Light 2.
So I'm skipping this generation of graphics cards AND this specific game until they learn how to use game engines properly.
 
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good, it's designed this way. The built in resolution scaler (Not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried with DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built in scaler. I'm playing on a 6950XT at 4K with everything except global illumination on ultra at 65% resolution and I stay above 60fps with the only drops in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more, it's quite good.

It's kind of the point I've been trying to make for a good long while, which is that we're headed into the era of FSR/DLSS "balanced" (~59% scale) for a lot of the more intensive/UE5 games. I haven't played this title personally, but I agree that the setting generally does not look bad at all IMHO regardless of upscaling tech, and given how Starfield/UE5 Fortnite were received (I haven't seen anyone complaining about the resolution on XSX), I don't think most people have a problem with it either.

It makes sense when you think about it: the internal resolutions are 2227x1253 (DLSS) and 2259x1270 (FSR), which are essentially the max of the THX spec regarding distance per size/visual acuity over a 40-degree arc. Or as I like to put it: when you sit close enough to a TV that it fills your whole vision without needing to move your head, that resolution should still be fine for the vast majority of people.

I also think it's weird that reviewers, if they test FSR/DLSS at all, generally go straight from "quality" (66%, which is still often overkill IMHO) to "performance" (50%), which can start to look a little questionable. It's almost like the modes match what they say on the tin, but people don't appear to care. Thanks for being a voice of reason.
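If you want to sanity-check that 40-degree claim, here's the rough arithmetic (a sketch assuming the commonly cited ~60 pixels per degree for 20/20 acuity and a ~40° THX-style viewing arc; the widths are the DLSS/FSR ones quoted above, and this ignores the fact that the final output is reconstructed at 4K):

```python
# Pixels per degree across a ~40 degree viewing arc (THX-style seating),
# compared against the ~60 ppd often cited as the 20/20 acuity limit.
VIEW_ARC_DEG = 40
ACUITY_PPD = 60  # approximate 20/20 visual acuity (assumption)

widths = {
    "DLSS balanced (2227 px wide)": 2227,
    "FSR balanced (2259 px wide)": 2259,
    "native 4K (3840 px wide)": 3840,
}

for label, width in widths.items():
    ppd = width / VIEW_ARC_DEG
    print(f"{label}: ~{ppd:.0f} px/deg "
          f"({ppd / ACUITY_PPD:.0%} of the ~{ACUITY_PPD} ppd acuity threshold)")
```

The balanced-mode input resolutions land around 56 px/deg, i.e. just shy of that rough acuity ceiling, which is the point being made above.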

I find it amusing that people are using this game as an example of why people don't need more than 8GB of VRAM. First of all, the texture quality in this title isn't particularly good. Second, let's not act like every game cleanly splits its system/VRAM allocation out of the shared RAM pool of a console. Ideally, yes, the ~14GB available to a console should generally equal an ~8/6GB split, but you need to realize that just doesn't always happen; some PC ports currently require 10-12GB, rarely 12+GB, of one or the other if not both. If you want to criticize all of those developers for their PC ports, that's fine, but it doesn't change the reality that it has happened and will keep happening, especially as we move to lower native resolutions on current-gen consoles (which will happen not only because UE5 is heavy and/or to create the perception of games doing more and looking better on the same hardware, but also to capitalize on the potential of a PS5 Pro) or more intensive settings on PC.

The one thing that truly might save you is the fact that the XSS has 10GB of RAM (really 8GB, as 2GB is 32-bit) and developers are forced to release a port for it if they join the Xbox ecosystem. That doesn't, however, mean that that console won't eventually fall into 720p territory. It's already sub-900p in a lot of instances, even if you try to make the argument that 1080p is the general aim.

You can fight it all you want, but the reality is that not only is 8GB questionable for 1080p, 12GB is questionable for 1440p when matching GPU performance to its buffer. There already are, and will be more, instances where a 4070 Ti falls below 1440p/60 simply because of the buffer; that's a card that seriously could use 13-14GB but doesn't have 16GB because of product segmentation and the art of the upsell to the 4080.

We can agree to disagree, but let's revisit this topic after the release of the PS5 Pro, Blackwell, and 3nm Navi. Heck, maybe after the release of Battlemage/Navi 4x (which will likely match their performance to 16GB exactly and aim for 1440p, relegating the 4080 to the same market eventually). Those cards will almost certainly be 16GB, and the next generation likely 16-32GB. When you stop and realize the next consoles themselves will probably be 32GB, with the then-older consoles relegated to (1080p-)1440p maximum, it all starts to make sense.

I'm not saying you're completely wrong at the moment, but with regard to the fairly immediate and especially the longer-term future (especially when spending $400+ on a video card), I would certainly opt for the larger buffer. People can choose to believe the 4070 Ti is not planned for obsolescence by having 12GB, or that the 4070 has the performance of a card that only needs ~10-11GB, but those will be the same people NVIDIA targets with FOMO, using settings unplayable on those cards, once Blackwell releases. That's just how NVIDIA runs the business of selling new video cards, which is important to understand: incredibly savvy, but also incredibly shitty for customers. They know people will forget about this conversation when the next generation releases and 10-12GB cards are old news and/or relegated to 1080p, just as they have done several times before with 3/6GB cards. To bring it full circle: I, and likely other people, will then recommend running those cards at 4K DLSS Balanced, or roughly 59% scale (as Colossus recommends), if not 50%/"performance"/1080p.
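And to make the console memory arithmetic explicit, a rough sketch (the OS reservation and the splits are assumptions based on the approximate figures above, not official numbers):

```python
# Back-of-the-envelope console memory budget, using the rough figures above.
# All values are approximations/assumptions, not official specifications.
TOTAL_RAM_GB = 16.0    # PS5 / Series X unified memory
OS_RESERVED_GB = 2.0   # roughly what's held back for the OS (assumption)
available = TOTAL_RAM_GB - OS_RESERVED_GB

# The "ideal" split mentioned above: ~8 GB acting as VRAM, the rest as system RAM.
vram_like = 8.0
print(f"Available to the game: ~{available:.0f} GB")
print(f"Idealized split: ~{vram_like:.0f} GB 'VRAM' + ~{available - vram_like:.0f} GB 'system'")

# If a port leans harder on the GPU side, the same pool can just as easily
# end up looking like a 10-12 GB VRAM requirement on PC:
for heavier_vram in (10.0, 12.0):
    print(f"Heavier split: ~{heavier_vram:.0f} GB 'VRAM' "
          f"+ ~{available - heavier_vram:.0f} GB 'system'")
```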
 
Ah, the classic: make up an accusation so you can easily refute it immediately.

And to explain that a bit. If it was known/accepted "drivers don't work", nobody would be buying those cards. Really simple.
What is known/accepted is that ATI/AMD drivers were historically more faulty and incurred higher overhead. What is also known/accepted is that over time they got up there with Nvidia and are now trading blows. The only significant thing I can name about AMD drivers off the top of my head is that they seem to regress idle power draw every now and then.
That is not the current 'conventional wisdom' at all. The TPU comments on Radeon content are filled with 'I just wish AMD drivers didn't suck' or something similar. I appreciate W1zzard and others confirming in this review that they didn't find any significant problems using Radeon cards.
 