
Immortals of Aveum Benchmark Test and Performance Analysis

OK, but what about 1080p 60 FPS for the RTX 3080? Does hardware performance really drop that much in 3 years? If the situation is really that disgraceful, it's the developer's fault.
lol 59.1 FPS not close enough for you? Dropping one setting, pretty much any setting, would do it for you. As would overclocking.
 
I've been playing with the integrated Performance Budget Tool, which is found in the graphics menu. The game assesses your graphics card's and processor's performance on first launch, presents the benchmark results as your total GPU and CPU "budgets", and sets the detail options accordingly. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum value:

Setting                              Minimum   GPU cost   CPU cost   Maximum   GPU cost   CPU cost
Texture quality                      low       30         10         ultra     50         10
Visual effects quality               low       10         10         ultra     30         10
Shadow quality                       low       30         10         ultra     50         10
Post processing quality              low       20         10         ultra     40         30
Volumetric fog resolution            low       20         10         ultra     60         30
Global illumination quality          low       30         20         ultra     90         20
Reflection quality                   low       40         0          high      60         0
Anisotropic filtering                off       0          0          16x       100        100
Ambient occlusion quality            low       10         10         ultra     50         30
Atmosphere quality                   low       10         10         ultra     50         10
Cinematics depth of field quality    low       20         10         ultra     60         30
Foliage quality                      low       10         10         ultra     10         10
Mesh quality                         low       30         10         ultra     90         10
Cinematics motion blur quality       low       30         10         ultra     50         10
Particle quality                     low       40         10         ultra     60         10
Shadow mesh quality                  low       10         20         ultra     70         20
Shadow resolution quality            low       20         20         ultra     20         40
Subsurface scattering quality        low       20         10         ultra     80         10
Total cost                                     380        190                  1020       390

There are also two toggles -- Light shafts and Local exposure -- without allotted numerical costs.

While the whole idea sounds very practical in theory, the values chosen by the developers to represent each cost make one wonder. While global illumination and mesh quality certainly burden the GPU, why is anisotropic filtering supposed to be the most demanding setting? Also, the tool seems to miscalculate the total CPU budget. In the developers' own words:

View attachment 310398

My 5800X3D was evaluated at 230, whereas the 13900K gets 290. The Performance Budget Tool could become really helpful eventually, but it needs further polishing. Also, the game would benefit greatly from textual descriptions of each setting, with the corresponding visual benefit/tradeoff. That could help less experienced gamers make more informed choices in quality vs. performance.
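
Just to make the arithmetic concrete, here's a minimal sketch (in Python) of what a budget tool like this seems to boil down to: start from the cheapest levels, then raise settings while the summed GPU and CPU costs from the table above stay under the measured budgets. The cost numbers are taken from the table; the greedy loop, the function names, and the reduced setting list are my own assumptions, not the game's actual algorithm.

# Hypothetical sketch of how a budget tool like this could pick settings.
# Cost numbers come from the table above; the selection logic is an assumption.
SETTINGS = {
    # setting: [(level, gpu_cost, cpu_cost), ...] from cheapest to most expensive
    "Texture quality":             [("low", 30, 10), ("ultra", 50, 10)],
    "Global illumination quality": [("low", 30, 20), ("ultra", 90, 20)],
    "Anisotropic filtering":       [("off", 0, 0),   ("16x", 100, 100)],
    "Mesh quality":                [("low", 30, 10), ("ultra", 90, 10)],
    # ... the remaining settings from the table are omitted for brevity
}

def pick_settings(gpu_budget: int, cpu_budget: int) -> dict:
    """Start everything at its cheapest level, then upgrade settings one step
    at a time while the summed GPU and CPU costs stay within both budgets."""
    chosen = {name: 0 for name in SETTINGS}   # index into each level list

    def totals():
        gpu = sum(SETTINGS[name][i][1] for name, i in chosen.items())
        cpu = sum(SETTINGS[name][i][2] for name, i in chosen.items())
        return gpu, cpu

    upgraded = True
    while upgraded:
        upgraded = False
        for name, levels in SETTINGS.items():
            i = chosen[name]
            if i + 1 < len(levels):
                chosen[name] = i + 1
                gpu, cpu = totals()
                if gpu <= gpu_budget and cpu <= cpu_budget:
                    upgraded = True           # keep the upgrade
                else:
                    chosen[name] = i          # roll back, over budget
    return {name: SETTINGS[name][i][0] for name, i in chosen.items()}

# Example budgets: 700 is the tool's rating for a 3080 (listed below), 230 my 5800X3D
print(pick_settings(gpu_budget=700, cpu_budget=230))

One thing this makes obvious: with a maximum CPU cost of 390 in the table, a CPU budget of 230 can't cover everything at ultra no matter what the GPU does, which may be part of why the CPU evaluation feels off.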

And just for reference, those are the total budgets assigned by the tool to various GPUs:

7900 XTX       1750
4090           1450
4080           1100
3080 Ti         800
4070            700
3080            700
3060 Ti         500
2060            300
1650            100
Steam Deck       60

I was watching a video about that earlier; it seems whatever system they came up with to generate these numbers is full of shite.
 
They should have stuck with the Frostbite engine. At least some devs there would know how to use it.
Frostbite is a huge POS that's extremely complicated to work with. UE, on the other hand, makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. There is a reason why everybody is using UE.
 
lol 59.1 FPS not close enough for you? Dropping one setting, pretty much any setting, would do it for you. As would overclocking.
IMHO this is kinda missing the point - the game just doesn't look good, lol. There are games that both look better on a technical level and run better too so surely this is a fail. Who is to blame is another topic altogether but the fact is performance here is abysmal.
 
Frostbite is a huge POS that's extremely complicated to work with. UE, on the other hand, makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. There is a reason why everybody is using UE.

I wouldn't be surprised if in the next couple of years at least 30-40% of major releases use this engine. A lot of major studios that used engines I liked have made the switch, which I think is unfortunate. Hopefully I'm just being pessimistic and it all works out, but the engine hasn't gotten off to a good start for consumers, regardless of how much easier it makes things for the developer.

Remnant 2 did seem to sell well regardless of its performance, though.
 
I wouldn't be surprised if in the next couple of years at least 30-40% of major releases use this engine. A lot of major studios that used engines I liked have made the switch, which I think is unfortunate. Hopefully I'm just being pessimistic and it all works out, but the engine hasn't gotten off to a good start for consumers, regardless of how much easier it makes things for the developer.
Agree 100%
 
Frostbite is a huge POS that's extremely complicated to work with. UE, on the other hand, makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. There is a reason why everybody is using UE.

Yeah, well, there is a reason everyone here drives a Volkswagen as well, and that reason is that... everyone is driving a Volkswagen, so there are a lot of parts available for cheap and expertise is also everywhere to find. Good luck fixing a Lexus at your local dealership.
That does not mean a Volkswagen is a better car than a Lexus, though...
 
Yeah, well, there is a reason everyone here drives a Volkswagen as well, and that reason is that... everyone is driving a Volkswagen, so there are a lot of parts available for cheap and expertise is also everywhere to find. Good luck fixing a Lexus at your local dealership.
You are making a good point too; there's A LOT of talent out there that knows how to work with UE.

The underlying reason is that it's actually nice to work with UE. Like 2 years ago I wanted to write my own benchmark for in-house testing at customizable workloads .. with very little experience .. UE was by far the best learning experience, and it's free, and they give you all the source code
 
Frostbite is a huge POS that's extremely complicated to work with. UE, on the other hand, makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. There is a reason why everybody is using UE.

In that case, why is the game so badly optimised? Unless it's just teething issues with UE5 being more or less new? But yeah, I know about Frostbite being complicated and hard to work with because nobody really knows how to use it properly, but EA was trying to get all the studios on board with it at one point.

Hindsight being 20/20, they should have tried harder to keep the original team that built the engine around for longer, so they could maybe train more people, but I guess the staff turnover has been pretty catastrophic the last few years.
 
In that case, why is the game so badly optimised? Unless it's just teething issues with UE5 being more or less new? But yeah, I know about Frostbite being complicated and hard to work with because nobody really knows how to use it properly, but EA was trying to get all the studios on board with it at one point.

Hindsight being 20/20, they should have tried harder to keep the original team that built the engine around for longer, so they could maybe train more people, but I guess the staff turnover has been pretty catastrophic the last few years.
In general, heavily modularized software is harder to optimize -- so many moving parts, and keeping maximum compatibility/flexibility eats up the potential. UE being design-heavy (think no-code, plugins, extensions) rather than code-heavy eats into the performance budget.

Also, Nanite replaces the old static LOD meshes, which could be hand-tuned for the best loading and resident load on the GPU; Nanite adds CPU load to dynamically "generate" those LODs instead. Lumen is basically software global illumination, which is really heavy -- the kind of effect that used to require several tricks to run fast.
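
For contrast, here's roughly what the old hand-authored static LOD path amounts to at runtime -- a minimal sketch with invented mesh names and distance thresholds. The expensive simplification work happens offline, done by artists, so the per-frame cost is basically a distance comparison; the dynamic selection/generation described above is the part Nanite takes over at runtime.

# Minimal sketch of classic, hand-authored LOD selection -- the approach Nanite
# replaces. Mesh names and thresholds are made up for illustration; the point is
# that the runtime work is just a distance comparison, with no mesh generation.
from dataclasses import dataclass

@dataclass
class LodLevel:
    mesh: str            # pre-built mesh asset, authored and tuned by an artist
    max_distance: float  # use this mesh while the camera is closer than this

STATUE_LODS = [
    LodLevel("statue_lod0_50k_tris", 10.0),        # full detail up close
    LodLevel("statue_lod1_12k_tris", 40.0),
    LodLevel("statue_lod2_2k_tris", 150.0),
    LodLevel("statue_lod3_300_tris", float("inf")),  # distant impostor
]

def select_lod(lods, camera_distance):
    """Pick the cheapest pre-built mesh that still looks acceptable at this distance."""
    for lod in lods:
        if camera_distance < lod.max_distance:
            return lod.mesh
    return lods[-1].mesh

print(select_lod(STATUE_LODS, 25.0))   # -> statue_lod1_12k_tris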
 
I've been playing with the integrated Performance Budget Tool, which is found in the graphics menu. The game assesses your graphics card's and processor's performance on first launch, presents the benchmark results as your total GPU and CPU "budgets", and (supposedly) sets the optimal detail level. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum value:

Setting                              Minimum   GPU cost   CPU cost   Maximum   GPU cost   CPU cost
Texture quality                      low       30         10         ultra     50         10
Visual effects quality               low       10         10         ultra     30         10
Shadow quality                       low       30         10         ultra     50         10
Post processing quality              low       20         10         ultra     40         30
Volumetric fog resolution            low       20         10         ultra     60         30
Global illumination quality          low       30         20         ultra     90         20
Reflection quality                   low       40         0          high      60         0
Anisotropic filtering                off       0          0          16x       100        100
Ambient occlusion quality            low       10         10         ultra     50         30
Atmosphere quality                   low       10         10         ultra     50         10
Cinematics depth of field quality    low       20         10         ultra     60         30
Foliage quality                      low       10         10         ultra     10         10
Mesh quality                         low       30         10         ultra     90         10
Cinematics motion blur quality       low       30         10         ultra     50         10
Particle quality                     low       40         10         ultra     60         10
Shadow mesh quality                  low       10         20         ultra     70         20
Shadow resolution quality            low       20         20         ultra     20         40
Subsurface scattering quality        low       20         10         ultra     80         10
Total cost                                     380        190                  1020       390

There are also two toggles -- Light shafts and Local exposure -- without allotted numerical costs.

While the whole idea sounds very practical, the values chosen by the developers to represent each cost make one wonder. While global illumination and mesh quality certainly burden the GPU, why is anisotropic filtering supposed to be the most demanding setting of all? Also, the tool seems to miscalculate the total CPU budget. According to the developers (the resolutions below assume upscaling in quality mode):

View attachment 310398

My 5800X3D was evaluated at 230, whereas the 13900K gets 290 -- both of which seem way off. The Performance Budget Tool could become really helpful eventually, but it needs further polishing. Also, the game would benefit greatly from textual descriptions of each setting, with the corresponding visual benefit/tradeoff. That could help less experienced gamers make more informed choices in quality vs. performance.

And just for reference, those are the total budgets assigned by the tool to various GPUs:

7900 XTX       1750
4090           1450
4080           1100
3080 Ti         800
4070            700
3080            700
3060 Ti         500
2060            300
1650            100
Steam Deck       60

Haha wow, I can't even. This budgeting tool is hilarious. The 7900 XTX outweighing the 4090 by that much just proves it.

They can keep their game, the engine, and maybe partner with AMD to shift some extra copies for Red Team fans to benchmark and claim their GPU wins at something in the wccftech comment section. That's about all it's good for, unless this game gets major patching to the point where all Nvidia GPUs, top to bottom, get their budget score at least doubled.

In general, heavily modularized software is harder to optimize -- so many moving parts, and keeping maximum compatibility/flexibility eats up the potential. UE being design-heavy (think no-code, plugins, extensions) rather than code-heavy eats into the performance budget.

Also, Nanite replaces the old static LOD meshes, which could be hand-tuned for the best loading and resident load on the GPU; Nanite adds CPU load to dynamically "generate" those LODs instead. Lumen is basically software global illumination, which is really heavy -- the kind of effect that used to require several tricks to run fast.

I can understand the tech behind it, but the implementation is subpar and hilariously biased towards one type of hardware architecture. That's no way to ship a game.

There's just no possible way that a game that struggles like this on a rig of W1zz's caliber gets a pass. No matter the reason.

I'm really interested in seeing if the devs will chime in and explain the situation, or if this is just Ashes of the Benchmark 2.0.
 
Haha wow, I can't even. This budgeting tool is hilarious. The 7900 XTX outweighing the 4090 by that much just proves it.

They can keep their game, the engine, and maybe partner with AMD to shift some extra copies for Red Team fans to benchmark and claim their GPU wins at something in the wccftech comment section. That's about all it's good for, unless this game gets major patching to the point where all Nvidia GPUs, top to bottom, get their budget score at least doubled.

I think this has more to do with the engine than the developer.... I think in an interview they said some crazy stuff about using AI to make it more accurate, but I could be wrong lol...
 
I think this has more to do with the engine than the developer.... I think in an interview they said some crazy stuff about using AI to make it more accurate, but I could be wrong lol...

Sure, AMD has the XDNA instructions on RDNA 3. Nvidia has had an entire dedicated AI engine available to developers since Turing, going back 5 years now.
 
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I am not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there is no tomorrow.
 
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I am not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there is no tomorrow.

Not to mention it looks like the dev team only had the time to develop and test against one architecture, and they picked the one that's outsold 8 to 1 by its competition.

The simple fact that this is attributing a much higher performance grade to the 7900 XTX tells you all you need to know.

I get UE5 being AMD-friendly -- Remnant II seemed a little AMD-biased, but nothing too outrageous -- but here? This makes no sense whatsoever.
 
I've hated it in every game I've tried it in. It almost ruins FF16, although it's so bad in that one it's got to be FSR 1.

If you like it, good for you though.
I think you have to look back at the original intent of the technology. FSR 1 was released to make it simple to apply, such that any GPU, including older ones, has some sort of upscaling tech, so you don't have to sacrifice on resolution. Sure, it is known to be inferior to DLSS, but I don't see Nvidia doing a GPU-agnostic solution like Intel and AMD.
 
I think you have to look back at the original intent of the technology. FSR 1 was released to make it simple to apply, such that any GPU, including older ones, has some sort of upscaling tech, so you don't have to sacrifice on resolution. Sure, it is known to be inferior to DLSS, but I don't see Nvidia doing a GPU-agnostic solution like Intel and AMD.

The question that needs to be asked is whether AMD would make FSR completely hardware-agnostic if the following conditions were met:

1. They had a market leader position
2. Their hardware had matrix multiplication capabilities
3. By leveraging their own hardware's feature set, they could achieve a better result

Intel opted to make XeSS usable on more than just Arc by enabling an alternative, less efficient code path. Nvidia could probably do the same if it really wanted to, except that the three conditions above are currently met by their product line, while Intel has probably been selling Arc at cost, or even subsidized, to get through this debut generation.
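
The "alternative, less efficient code path" idea is easy to picture. Below is a minimal sketch of capability-based dispatch with made-up flags and names -- not the real XeSS SDK API: probe for the dedicated matrix hardware and fall back to a generic path when it's absent.

# Hypothetical sketch of the "two code paths" idea behind a hardware-agnostic
# upscaler: use the fast matrix (XMX) path when dedicated hardware is present,
# otherwise fall back to a slower but widely supported path. Flags and names
# here are invented for illustration.
from enum import Enum, auto

class UpscalePath(Enum):
    XMX_MATRIX = auto()     # dedicated matrix units (Intel Arc)
    DP4A_FALLBACK = auto()  # generic integer dot-product path most modern GPUs support
    UNSUPPORTED = auto()

def choose_upscale_path(has_xmx: bool, has_dp4a: bool) -> UpscalePath:
    """Pick the best code path the GPU can actually run."""
    if has_xmx:
        return UpscalePath.XMX_MATRIX
    if has_dp4a:
        return UpscalePath.DP4A_FALLBACK
    return UpscalePath.UNSUPPORTED

print(choose_upscale_path(has_xmx=True, has_dp4a=True))    # UpscalePath.XMX_MATRIX
print(choose_upscale_path(has_xmx=False, has_dp4a=True))   # UpscalePath.DP4A_FALLBACK

DLSS could, in principle, ship a similar fallback path; the argument above is that Nvidia currently has little incentive to.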
 
From what I see so far, all UE5 games seem to have this grainy feel to the textures, and not just because of the film grain. I thought it was Lumen at first, with the denoising, but even Remnant 2 seems to have it, and that one doesn't use Lumen.
 
From what I see so far, all UE5 games seem to have this grainy feel to the textures, and not just because of the film grain. I thought it was Lumen at first, with the denoising, but even Remnant 2 seems to have it, and that one doesn't use Lumen.

Yeah, something just looks off to me as well; maybe it's the way textures are compressed in this engine. The one thing Epic doesn't seem to have solved is texture quality in general, though. It's pretty bad in some spots in both this and Remnant; not sure how much of that is the engine and how much is the developer, though.
 
Given the 3 comparative images, the game is far from ugly on the "low" preset ... (compared to max settings)
but still, the numbers are pretty low :/
 
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I am not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there is no tomorrow.
Yup, the "fault" of UE5 is that it is so easy to use.
Thus even less competent devs like this can make games that look good in trailers and screenshots.
Once they've got the money, most people don't bother refunding their games, even if they run like garbage.
At most they complain on social media and forget about it after a while.
So in the end the incentive is to skimp on optimization, which is time-consuming and expensive in man-hours.
 
Great to see RDNA doing well in this game. I would've thought the VRAM requirements would be much worse after seeing the recommended specs.
 
The Elden Ring maker's Armored Core VI comes out today. @W1zzard, I hope we get a performance review for this one; it scored really well in PC Gamer's review yesterday. :rockout:
 
Not to mention it looks like the dev team only had the time to develop and test against one architecture, and they picked the one that's outsold 8 to 1 by its competition.
That's not really true. I know this is a PC forum, but if you're talking about "architecture" and outselling (in gaming), you'd need to count the consoles too.
 
The Elden Ring maker's Armored Core VI comes out today. @W1zzard, I hope we get a performance review for this one; it scored really well in PC Gamer's review yesterday. :rockout:
Yeah, definitely. Any idea if it's 60 FPS locked again?
 