
Immortals of Aveum Benchmark Test and Performance Analysis

W1zzard

Immortals of Aveum is built using Unreal Engine 5 and has both Lumen and Nanite enabled, which achieves impressive graphical effects. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide selection of modern graphics cards.

 
what a masterpiece of optimization.
 
Wow, that is bad... a 7900 XT or RTX 4080 needed for 60 FPS avg @ 1440p / very high settings...
 
I saw the writing on the wall; this seems to happen with every UE update... My two-year-old GPU is going into my son's computer to be a 1080p superstar as soon as the GDDR7 GPUs are available. /sigh
 
Hardware requirements are insane, which would be fine if this was a new Crysis... but it REALLY isn't; it looks... well, IMO, even ugly.

Fort Solis, on the other hand, the other UE5 game, does look very impressive, so if that performs similarly I'm much more inclined to give it a pass.
 
To achieve 60 fps at 1440p you need a nine-hundred-dollar GPU. This has to be satire. My one-year-old 12 GB 3080 cannot even post 60 fps. Pathetic; the devs should be ashamed of this sloppy work.
 
To achieve 60 fps at 1440p you need a nine-hundred-dollar GPU. This has to be satire. My one-year-old 12 GB 3080 cannot even post 60 fps. Pathetic; the devs should be ashamed of this sloppy work.
Hate to break it to you but your 3080 is about to hit 3 years old.
 
LOL.

That's it. That's my response.
 
Another meh-looking UE5 game given the performance... With so many game studios switching to this engine, the future on the PC side looks pretty dire. I guess we will need 5090s to get half-decent performance in this engine without upscaling... assuming Nvidia can repeat the 4090's gen-on-gen gains.

I will give the devs credit for at least making it perform equally badly on both AMD and Nvidia hardware, assuming there is anything they could actually do about it and it's not just this engine in general when using Nanite and Lumen.
 
To achieve 60 fps at 1440p you need a nine-hundred-dollar GPU. This has to be satire. My one-year-old 12 GB 3080 cannot even post 60 fps. Pathetic; the devs should be ashamed of this sloppy work.


Well, sometimes we all have to take a step back and look at the big picture.
OK, so regardless of this game's looks, it's up to the devs to decide what even counts as "high" settings; there is no rule on what high needs to be. They could have relabeled what is now low as high, and assuming the game scales well, suddenly performance would be great on "high" settings.

Hell, Crysis Remastered did this in a funny way, almost as if to expose how emotional people get around this topic: its lowest settings are called high, medium is called ultra, and high is called enthusiast. So, to preserve the "dignity" of PC gamers, you never run the game on "low"... technically.

And you can call me old, but the sliders really don't do a lot between low and high. Sure, I can see some distant model detail and shadow differences, but overall... I don't know if I could tell whether a game was running on low, medium, high, or ultra from a single screenshot without direct comparison shots.

And that is the kind of silly situation that always surrounds PC gaming with tweakable settings.

If you could play on a 3080 at 1440p at 60 fps with all settings on medium and some on high, with it looking nigh identical to ultra, would that really be such an issue?

Alex from DF does his optimized settings exactly like this: he checks which settings gain performance with minimal hits to the visuals, and often you end up with something that looks extremely similar but performs 20 fps better.

(That, again, is separate from the fact that this game does not look great, good, or even decent...)


EDIT: Looking at the game's comparison screenshots and the performance graphs for the 4080, we see about a 10 fps boost from going to medium settings and 20 fps from going to low (which, again, to my eyes looks nigh identical, and we haven't even factored in DLSS at that point).
So DLSS Quality, or one step down, 1080p upscaled to 1440p, should already basically give you 60 fps, and then a few more settings tweaks that visually don't matter can gain you some more (rough numbers for those upscaler modes below).
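As a side note on that last point, here's a quick sketch of what the common upscaler quality modes actually render internally at a 1440p output, using the published DLSS/FSR2 per-axis scale factors; the helper function and mode labels are just for illustration:

```python
# Rough internal render resolutions for common upscaler quality modes at a
# 2560x1440 output. Per-axis scale factors are the published DLSS/FSR2
# defaults; everything else here is made up for this sketch.
SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before the upscaler runs."""
    scale = SCALE_FACTORS[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in SCALE_FACTORS:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:>17}: {w}x{h}")
# Quality     -> ~1708x960 (already below native 1080p)
# Balanced    -> ~1485x835
# Performance ->  1280x720
# So "1080p upscaled to 1440p" (1920x1080) sits between Quality and Balanced.
```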
 
Incredibly bad optimization for Ampere GPUs. The RTX 3000 series is officially dead in this game.
 
I do not know why people are complaining about GPU requirements. If you have lived long enough to see the pattern, you would know that many new games demand a more powerful GPU than even the top card can provide. Playing older games on new hardware is the way to get the optimal experience.
 
Hate to break it to you but your 3080 is about to hit 3 years old.
OK, but what about 1080p 60 fps for the RTX 3080? Does hardware performance drop that much in 3 years? If there is such a disgraceful situation, it is the fault of the developer.
 
Low and Ultra in this game show little difference in the screenshots. Slightly better lighting is all I see; maybe it's a bit different in motion (more pop-in?).

Like I said in the last performance analysis of a UE5 game, it's just like UE4: it runs like garbage at release, across the board, in every game that isn't from an Epic Games studio.
 
OK, but what about 1080p 60 fps for the RTX 3080? Does hardware performance drop that much in 3 years? If there is such a disgraceful situation, it is the fault of the developer.

When the 3090 was first demoed on this engine, it was struggling at 1080p, so there is no real surprise that a 3080 also struggles, especially without DLSS.
 
When the 3090 was first demoed on this engine, it was struggling at 1080p, so there is no real surprise that a 3080 also struggles, especially without DLSS.
I would never buy a game with such bad fps.
Compared to games released in 2020-2021, it unfortunately does not work wonders, especially given the hardware requirements.
I wonder how it will run on PS5.
 
I would never buy a game with such bad fps.
Compared to games released in 2020-2021, it unfortunately does not work wonders, especially given the hardware requirements.
I wonder how it will run on PS5.

Performance looked pretty bad on Series X with FSR... trash, I mean 2...
 
Performance looked pretty bad on Series X with FSR... trash, I mean 2...

Please stop with these unintelligent hate-campaign comments and actually try it to see how it fares.
A recent DF video showed FSR2 doing wonders for the re-release of Red Dead.

I would never buy a game with such bad fps.
Compared to games released in 2020-2021, it unfortunately does not work wonders, especially given the hardware requirements.
I wonder how it will run on PS5.

what fps would it have to have had?
 
They should have stuck with the Frostbite engine. At least some devs there know how to use it, even though the original engine team is no longer part of EA.
 
When you buy this game, you help make a dev's life better (their work life, that is; not sure about life-life). Is there going to be a separate deep dive into how Nanite and Lumen are used? Also, are there any anomalies where they don't work as well as expected?
 
Please stop with these unintelligent hate-campaign comments and actually try it to see how it fares.
A recent DF video showed FSR2 doing wonders for the re-release of Red Dead.

I've hated it in every game I've tried it in. It almost ruins FF16, although it's so bad there that it's got to be FSR1.

If you like it, good for you though.
 
what fps would it have to have had?
I have a 3070, and I play most 2020-21 games at 1080p at 100 fps. OK, there is UE5, but an RTX 3080 should at least give close to 80-90 fps at 1080p.
 
I've been playing with the integrated Performance Budget Tool, which is found in the graphics menu. The game assesses your graphics card's and processor's performance on first launch, presents the benchmark results as your total GPU and CPU "budgets", and (supposedly) sets the optimal detail level. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum value:

Setting                              Min    GPU cost  CPU cost   Max    GPU cost  CPU cost
Texture quality                      low       30        10      ultra     50        10
Visual effects quality               low       10        10      ultra     30        10
Shadow quality                       low       30        10      ultra     50        10
Post processing quality              low       20        10      ultra     40        30
Volumetric fog resolution            low       20        10      ultra     60        30
Global illumination quality          low       30        20      ultra     90        20
Reflection quality                   low       40         0      high      60         0
Anisotropic filtering                off        0         0      16x      100       100
Ambient occlusion quality            low       10        10      ultra     50        30
Atmosphere quality                   low       10        10      ultra     50        10
Cinematics depth of field quality    low       20        10      ultra     60        30
Foliage quality                      low       10        10      ultra     10        10
Mesh quality                         low       30        10      ultra     90        10
Cinematics motion blur quality       low       30        10      ultra     50        10
Particle quality                     low       40        10      ultra     60        10
Shadow mesh quality                  low       10        20      ultra     70        20
Shadow resolution quality            low       20        20      ultra     20        40
Subsurface scattering quality        low       20        10      ultra     80        10
Total cost                                    380       190              1020       390

There are also two toggles -- Light shafts and Local exposure -- without allotted numerical costs.
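To make the budget accounting concrete, here's a minimal sketch of how a cost/budget check like this presumably works, reusing the cost table above. The real tool's internals aren't public, so the names, data layout, and selection logic are my own invention:

```python
# Hypothetical sketch of the Performance Budget Tool's cost accounting,
# built from the in-game cost table above. Names and structure are
# invented for illustration; only the min/max cost endpoints are known.
COSTS = {
    # setting: {level: (gpu_cost, cpu_cost)}, min level first
    "texture_quality":             {"low": (30, 10), "ultra": (50, 10)},
    "visual_effects_quality":      {"low": (10, 10), "ultra": (30, 10)},
    "shadow_quality":              {"low": (30, 10), "ultra": (50, 10)},
    "post_processing_quality":     {"low": (20, 10), "ultra": (40, 30)},
    "volumetric_fog_resolution":   {"low": (20, 10), "ultra": (60, 30)},
    "global_illumination_quality": {"low": (30, 20), "ultra": (90, 20)},
    "reflection_quality":          {"low": (40, 0),  "high": (60, 0)},
    "anisotropic_filtering":       {"off": (0, 0),   "16x": (100, 100)},
    "ambient_occlusion_quality":   {"low": (10, 10), "ultra": (50, 30)},
    "atmosphere_quality":          {"low": (10, 10), "ultra": (50, 10)},
    "cinematics_dof_quality":      {"low": (20, 10), "ultra": (60, 30)},
    "foliage_quality":             {"low": (10, 10), "ultra": (10, 10)},
    "mesh_quality":                {"low": (30, 10), "ultra": (90, 10)},
    "cinematics_motion_blur":      {"low": (30, 10), "ultra": (50, 10)},
    "particle_quality":            {"low": (40, 10), "ultra": (60, 10)},
    "shadow_mesh_quality":         {"low": (10, 20), "ultra": (70, 20)},
    "shadow_resolution_quality":   {"low": (20, 20), "ultra": (20, 40)},
    "subsurface_scattering":       {"low": (20, 10), "ultra": (80, 10)},
}

def totals(choices: dict[str, str]) -> tuple[int, int]:
    """Sum the GPU and CPU costs of a chosen settings combination."""
    gpu = sum(COSTS[s][lvl][0] for s, lvl in choices.items())
    cpu = sum(COSTS[s][lvl][1] for s, lvl in choices.items())
    return gpu, cpu

def fits_budget(choices: dict[str, str], gpu_budget: int, cpu_budget: int) -> bool:
    """A combination is valid only if it fits both benchmarked budgets."""
    gpu, cpu = totals(choices)
    return gpu <= gpu_budget and cpu <= cpu_budget

everything_min = {s: list(levels)[0] for s, levels in COSTS.items()}
everything_max = {s: list(levels)[1] for s, levels in COSTS.items()}
print(totals(everything_min))  # (380, 190) -- matches the table's total row
print(totals(everything_max))  # (1020, 390)
print(fits_budget(everything_max, gpu_budget=1100, cpu_budget=390))  # True
```

Under this reading, a card assessed at a budget of 1100 covers the 1020-point everything-at-ultra load, while one assessed at 600 would force cuts on several settings.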

While the whole idea sounds very practical, the values chosen by the developers to represent each cost make one wonder. While global illumination and mesh quality certainly burden the GPU, why is anisotropic filtering supposed to be the most demanding setting of all? Also, the tool seems to miscalculate the total CPU budget. According to the developers (the resolutions below assume upscaling in quality mode):

[dev.jpg: developer chart of recommended GPU/CPU budgets per output resolution]


My 5800X3D was evaluated at 230, whereas the 13900K gets 290 -- both of which seem way off. The Performance Budget Tool could become really helpful eventually, but it needs further polishing. Also, the game would benefit greatly from textual descriptions of each setting, with the corresponding visual benefit/tradeoff. That could help less experienced gamers make more informed choices in quality vs. performance.

And just for reference, those are the total budgets assigned by the tool to various GPUs:

GPU                               Budget
7900 XTX                          1750
4090                              1450
4080                              1100
3080 Ti                            800
4070 / 3080                        700
3070                               600
6700 XT / 3060 Ti / 3070 mobile    500
6600 / 2070S                       400
3060 mobile                        350
2060                               300
Asus ROG Ally                      250
1650                               100
1050 Ti                             80
Steam Deck                          60
 