
Immortals of Aveum Benchmark Test and Performance Analysis

Immortals of Aveum:

Turns out your GPU isn't immortal and is already dead.
 
Right, but a 3080 is about the same as a 4070, and the main advantage the 4070 has, besides being more energy efficient, is frame generation. The 3080 is a powerful card that happens to not support the newer software enhancements, which I suspect is intentional. It had all the performance I personally needed, but it's locked out by design; again, my suspicion. I just upgraded from a 3080 to a 4090 (arguably the best 4000-series value besides maybe the 4070) because I saw the writing on the wall: if you don't have upscaling along with frame generation (fake frames), a combo I call "fakescaling", you're not going to have a good time going forward.

Games are being made in UE5 without the optimization you find in Fortnite, which would have made the 3080 run great in Immortals of Upscaling, I mean Aveum. So the only way the 3080 being 3 years old is a problem is that game designers, or publishers, or whoever is to blame, are leaning heavily on the latest Nvidia software enhancements so they can do less optimization and ship the game as fast as possible for maximum profit. So you're not wrong, but saying the problem is the 3080 being 3 years old isn't entirely right either. The problem is that the 3080 is locked out by Nvidia, combined with the greed of whoever is responsible for Immortals of Upscaling, I mean Aveum, being unoptimized.

Until we have 8-10 major releases using this engine, I think it's a little early to say it's down to optimization. So far, both major releases, this and Remnant 2, perform badly without upscaling, and Layers of Fear looks so dated it's hard to judge, but even then performance is only okay.
 
To achieve 60 fps at 1440p you need a nine-hundred-dollar GPU. This has to be satire. My one-year-old 12 GB 3080 can't even post 60 fps. Pathetic; the devs should be ashamed of this sloppy work.
Your card is just ass lol, you should have bought a 6800 XT. Ain't nothing wrong with the game; it's UE5, it's demanding as fuck. Just because Nvidia does badly across the stack, y'all automatically assume it's an issue. Your cards are going to suck most of the time as time goes on, this is CLEARLY evident, have you seen all the new games? The 3080 performs badly in everything right now compared to the 6800 XT.

I was talking more about this game specifically: a 6800 nearly matches a 3080 at 4K and beats it at 1440p, and a 6900 XT beats a 3090 Ti at 1440p and nearly matches it at 4K. You don't normally see that in the majority of games, especially 3080 vs 6800.
This guy hasn't been paying attention to the updates for the 7900 XTX; the 4080 trails behind it so much now that the 4080 is a lot closer to the 7900 XT than it is to the 7900 XTX.
 
The original idea of DLSS, FSR, and frame generation was to help cards run at 4K 144 fps.
Nowadays it's all used just to make a game playable at 1080p 60 fps.

Also, why would anyone release a game where frame generation is a total failure?
This game is dead on arrival.

Some have also said this:
"The UE5 engine will run better on the next hardware."

My answer to that:
I'll wait for the next hardware release to buy the game.
 
As expected.

More and more developers are ditching in-house tech and switching to an engine designed for virtual production, not gaming.
 
Screw the plebs. UE5 will do wonders for the Witcher remaster, including allowing for easy mods and crowd-sourcing everlasting content... yeah!
 
Your card is just ass lol, you should have bought a 6800 XT. Ain't nothing wrong with the game; it's UE5, it's demanding as fuck. Just because Nvidia does badly across the stack, y'all automatically assume it's an issue. Your cards are going to suck most of the time as time goes on, this is CLEARLY evident, have you seen all the new games? The 3080 performs badly in everything right now compared to the 6800 XT.
Then why does the 3080 get high frame rates in Fortnite, a UE5 game? I'll tell you why: it's optimized and Immortals is not. The 6800 XT is only coming out on top in the last few months because of the VRAM, but it's going down soon too, because it's no faster than a 3080. Then you won't feel so superior anymore.
Edit: Just saw the announcement of FSR3 frame generation for all AMD GPUs, so it looks like the 6800 XT might last a good while longer, if FSR3 is actually good.
 
Immoral Crapteum would be a more appropriate name.

That's pretty much all UE5 games: sucking ass hard in performance. Seriously, how do you release this unoptimized crap? And it isn't "wow" eye candy either.

Next they'll ask for 36 GB cards.
 
Believe it or not, LOD and global illumination are barely recognizable to gamers. The Build engine and GZDoom are still going strong in 2023 :)
 
That was easy to patch out... running at like 160 FPS now.
Noice.
I've noticed in some videos so far that it seems it wasn't maxing out GPU usage in some situations; either CPU limited, or the game engine?
 
Artificially capped at 120 FPS and seems to be rendering the world at some lower resolution, too
 
FSR3 coming "soon" to Immortals of Aveum and Forspoken.

 
... high frame rates in Fortnite, a UE5 game?
Epic used matured UE5 features in the Fortnite update, such as Lumen. I blame Nanite, or its misuse.
 
Then why does the 3080 get high frame rates in Fortnite, a UE5 game? I'll tell you why: it's optimized and Immortals is not. ...

Fortnite in DX12 with all effects enabled gets roughly 1/3 of its DX11 performance. A 4090 can't even maintain 60 FPS at native 4K.

And that game looks like a cartoon; you can't even compare it to something like IoA.

Nobody in their right mind would sacrifice 2/3 of their performance in a competitive game to get effects they won't even notice during intense gameplay.

In a singleplayer game, you can get away with lower performance. The problem is that this game has no scalability. Settings barely do anything, and the upscaling is not impressive either. It's way too early for Lumen and Nanite to be used in real time.
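If you want to check how much of that cost is Lumen and Nanite yourself, UE5 exposes stock console variables that can usually be forced from the user-side config. This is a generic UE5 sketch, not anything Immortals of Aveum documents; individual games may ignore or override these, and a title shipped without Nanite fallback meshes can render incorrectly with Nanite off:

```ini
; Hypothetical A/B test via the user config, typically found at:
; %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini
[SystemSettings]
r.DynamicGlobalIlluminationMethod=0 ; 0 = no dynamic GI (Lumen off), 1 = Lumen
r.ReflectionMethod=2                ; 2 = screen-space reflections, 1 = Lumen reflections
r.Nanite=0                          ; disable Nanite; needs fallback meshes to look right
```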
 
Gotta get people ready so Nvidia can sell RTX 5090s somehow
 
Hate to break it to you, but your 3080 is about to hit 3 years old.
One year old or (nearly) three years old, it's still just one generation old, and it's basically been demoted to the "entry level" GPU for 1080p/60 gaming. Honestly, this is fairly insane; it's like asking people to upgrade their GPU every generation if they want to keep playing at their target resolution. Another level of insanity is how exactly they plan to sell this game, considering that, going by the last Steam survey, the most popular GPUs, typically used for 1080p gaming, are the GTX 1650, RTX 3060 and (the truly immortal, apparently) GTX 1060.
 
Current-gen consoles are running it at 720p, upscaled by FSR2 to 4K; even lower on the Series S, at 436p.

 
One year old or (nearly) three years old, it's still just one generation old, and it's basically been demoted to the "entry level" GPU for 1080p/60 gaming. ...
That's what I wanted to say.
 
Wow, that seems pretty trashy.

And seriously, game devs often don't test DLSS/FSR etc. They get handed a slab of code and are told 'add this', and it works on the generic base engine but not with all the custom stuff they've added since.
Look at how badly things like sharpness are often implemented: the defaults can max out at 100% or 0% in some games and need .ini or .cfg file tweaks to alter, since sliders were never implemented.
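As a concrete example: in Unreal Engine titles the sharpening filter can usually be adjusted through a standard console variable in the user Engine.ini, even when no slider is exposed. Treat the path and value as illustrative; whether a given game honors it varies per title:

```ini
; Typical location: %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini
[SystemSettings]
r.Tonemapper.Sharpen=0.5 ; 0 = off; values around 0.5-1.0 give a mild sharpen
```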
 
Current-gen consoles are running it at 720p, upscaled by FSR2 to 4K; even lower on the Series S, at 436p.

Yep, definitely time for a PS5 Pro, so it can upscale from 1080p instead of 720p.

And they supposedly want that to be an 8K machine, but even FSR Ultra Performance (3x per axis) would require 1440p rendering. Maybe they'll introduce an FSR Ultimate Performance mode that scales by a factor of 4 on each axis. That would take a 1080p image straight into 8K!
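For anyone checking the math, FSR 2's published per-axis scale factors make this easy to verify. A quick sketch; the 4x "Ultimate Performance" mode is the joke above, not a real AMD mode:

```python
# Internal render resolution needed to reach a target output with FSR 2.
# Per-axis scale factors per AMD's docs; "Ultimate Performance" is the
# hypothetical 4x mode joked about above, not a real FSR mode.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
    "Ultimate Performance (hypothetical)": 4.0,
}

def render_res(out_w: int, out_h: int, factor: float) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling."""
    return round(out_w / factor), round(out_h / factor)

for name, f in FSR2_MODES.items():
    w, h = render_res(7680, 4320, f)  # 8K output target
    print(f"8K via {name:<36} -> {w}x{h}")

# Ultra Performance: 8K needs 2560x1440 internally (the 1440p figure above).
# The hypothetical 4x mode: 1920x1080, i.e. "1080p straight into 8K".
# Same math for 4K: 3840x2160 / 3 = 1280x720, matching the console numbers.
```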
 
One year old or (nearly) three years old, it's still just one generation old, and it's basically been demoted to the "entry level" GPU for 1080p/60 gaming. ...
Yep. If this is the future, my 3070 won't run anything in a satisfying way. What I do run now, I run at 1440p/144 Hz. And now you're telling me it won't even be good for 1080p/60? Hell no.
 
It was clear from the get-go this was a pushed title meant to sell on screenshots and some weird form of shock and awe.

Well, they definitely managed the shock-and-awe aspect, to the extent that I'm totally skipping the game. A GPU hog with nothing to show for it, kthxbai.
 
Yep. If this is the future, my 3070 won't run anything in a satisfying way.
That's where DLSS without ray tracing will keep those GPUs running for a long, long time.

Games like this aren't going to become the norm, because sales will be terrible due to the low performance; devs will hopefully learn this isn't good enough. Sales on PC are higher the more PCs can run a title, while on console a game needs to look and run well for everyone, as a simple yes/no answer people can find online.
 