
Assassin's Creed Mirage Performance Benchmark

I wonder if the game works any better on X3D cache CPUs? Hmm, might have to look at a few other reviews next week and see if X3D CPUs benefit this game any.
Using a 7950X3D and a 4090 on DLSS Ultra Performance to force a CPU bottleneck. First pic is the 3D V-Cache cores, second pic is the standard cores; I'm using Process Lasso to swap between them. [attached: two screenshots]
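For anyone who'd rather script the swap than click through Process Lasso, here's a rough Python sketch using psutil that does the same affinity change. The core ranges are an assumption for a 7950X3D with SMT on (logical CPUs 0-15 on the V-Cache CCD, 16-31 on the frequency CCD), and the exe name is just a guess; check your own topology first.

# Pin a game's process to one CCD of a 7950X3D.
# Assumed layout: logical CPUs 0-15 = V-Cache CCD, 16-31 = frequency CCD.
# Needs "pip install psutil"; run from an elevated prompt on Windows.
import psutil

VCACHE_CORES = list(range(0, 16))   # assumption: CCD0 with 3D V-Cache
FREQ_CORES = list(range(16, 32))    # assumption: CCD1, higher clocks

def pin_by_name(exe_name, cores):
    # Set CPU affinity for every running process matching exe_name.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == exe_name.lower():
            proc.cpu_affinity(cores)
            print(f"pinned PID {proc.pid} to CPUs {cores[0]}-{cores[-1]}")

pin_by_name("ACMirage.exe", VCACHE_CORES)  # hypothetical exe name
# pin_by_name("ACMirage.exe", FREQ_CORES)  # rerun the benchmark on the other CCD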
 
The comment would be much better with more context.
Denuvo puts cooldowns on game start when you change your motherboard. That said, it seems like mine is over, and the 7800X3D is doing fair, producing an ever so slightly higher framerate than the Core i9-13900K (249 vs 243 FPS avg using a 4090 at 1080p low). I'll continue testing more AM5 SKUs now. This isn't the place where I can fully display reports and aggregates; if W1zz wants, he could surely test that and publish his findings.
 
I will say, for all their flaws, many games these days look pretty darn good even on "low" quality settings.

You think so? To me this looks like a game from 5 years ago, if not more, and it basically does not scale between ultra and low, which sucks for getting it to run well.
 
You think so? To me this looks like a game from 5 years ago, if not more, and it basically does not scale between ultra and low, which sucks for getting it to run well.
Maybe so, but think about what a big difference there was between high and low settings in games 10-15 years ago. Low was always a washed-out, low-poly show for those who didn't have a beefy GPU.

Seems like in recent games the 4070 Ti is edging ahead of the 7900 XT, while it was 10% slower at launch. Is it drivers or what?
Must be that limited VRAM that makes the 4070 Ti totally not futureproof, I guess ;)
 
Must be that limited VRAM that makes the 4070 Ti totally not futureproof, I guess ;)

Drivers..

RX 7900 XT 20GB - AC Valhalla: 1920x1080: 160 fps; 2560x1440: 127 fps; 3840x2160: 78 fps..
RTX 4070 Ti 12GB - AC Valhalla: 1920x1080: 146 fps; 2560x1440: 113 fps; 3840x2160: 70 fps..

RX 7900 XT 20GB - AC Mirage: 1920x1080: 135 fps; 2560x1440: 111 fps; 3840x2160: 71 fps..
RTX 4070 Ti 12GB - AC Mirage: 1920x1080: 141 fps; 2560x1440: 115 fps; 3840x2160: 72 fps..
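Putting those same figures through a quick percentage calculation makes the swing obvious; a throwaway Python snippet using the numbers quoted above:

# Relative gap (RX 7900 XT vs RTX 4070 Ti) from the fps figures above.
valhalla = {"1080p": (160, 146), "1440p": (127, 113), "2160p": (78, 70)}
mirage = {"1080p": (135, 141), "1440p": (111, 115), "2160p": (71, 72)}

for title, data in (("Valhalla", valhalla), ("Mirage", mirage)):
    for res, (amd, nv) in data.items():
        gap = (amd / nv - 1) * 100
        print(f"{title} {res}: 7900 XT {gap:+.1f}% vs 4070 Ti")

# Valhalla: +9.6%, +12.4%, +11.4% -> comfortable AMD lead
# Mirage: -4.3%, -3.5%, -1.4% -> small Nvidia lead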
 
Drivers..

RX 7900 XT 20GB - AC Valhalla: 1920x1080: 160 fps; 2560x1440: 127 fps; 3840x2160: 78 fps..
RTX 4070 Ti 12GB - AC Valhalla: 1920x1080: 146 fps; 2560x1440: 113 fps; 3840x2160: 70 fps..

RX 7900 XT 20GB - AC Mirage: 1920x1080: 135 fps; 2560x1440: 111 fps; 3840x2160: 71 fps..
RTX 4070 Ti 12GB - AC Mirage: 1920x1080: 141 fps; 2560x1440: 115 fps; 3840x2160: 72 fps..
Valhalla was AMD sponsored - the 7900 XT was beating the 4070 Ti handily. AC Mirage is the exact same engine as Valhalla but isn't AMD sponsored; boom, the 7900 XT eats dust. Makes you wonder :roll:
 
Valhalla loved ReBAR; try forcing ReBAR on with Nvidia Profile Inspector, it should give a healthy boost to Nvidia cards.
 
Valhalla was AMD sponsored - the 7900 XT was beating the 4070 Ti handily. AC Mirage is the exact same engine as Valhalla but isn't AMD sponsored; boom, the 7900 XT eats dust. Makes you wonder :roll:
99th percentile frame times are more important than average fps and there, the 7900 XT does fine against the 4070 Ti.

[attached: 99th percentile frametime chart]
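For anyone unfamiliar with the metric, here's a toy example of why the 99th percentile matters: a run can average high fps while still stuttering. The frametime trace below is made up for illustration, not taken from the review.

# Average fps vs 99th percentile frametime on a made-up trace:
# mostly smooth 8 ms frames with a few 30 ms stutters mixed in.
import statistics

frametimes_ms = [8.0] * 95 + [30.0] * 5  # illustrative numbers only

avg_fps = 1000 / statistics.mean(frametimes_ms)
p99_ms = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99) - 1]  # nearest-rank

print(f"average: {avg_fps:.0f} fps")              # ~110 fps, looks smooth on paper
print(f"99th percentile frametime: {p99_ms} ms")  # 30 ms, the stutter the average hides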
 
Eh, I kind of hoped they would retrofit DX12-level ray-traced reflections like they did for Far Cry 6's Dunia engine, but Anvil got no love. Though I do notice the screen-space reflections don't break down at screen edges.
 
No, it doesn't.

[attached: minimum fps chart at 2560x1440]
What are you on about? Cards slot differently based on res, what's new? There is nothing odd about any of these results, unless you're desperately trying to find a basis for some assumption that ignites a nice little camp feud again. I'm not seeing one here...

This just shows the 7900 XT excels at 4K in this situation and the 4070 Ti gets relatively better minimums at 1440p. The end.
 
What are you on about? Cards slot differently based on res, what's new? There is nothing odd about any of these results, unless you're desperately trying to find a basis for some assumption that ignites a nice little camp feud again. I'm not seeing one here...

This just shows the 7900 XT excels at 4K in this situation and the 4070 Ti gets relatively better minimums at 1440p. The end.
I said the 4070 Ti is doing well in this game; his reply was kinda "no it isn't", which is just false. In general it is the slower card in raster, so the fact that it gets ahead in averages means something. Saying it has worse 1% lows at 4K doesn't change the fact that it crept closer to the 7900 XT in performance.


Extra impressive because AC Mirage shares the same engine with Valhalla and Odyssey, and in those games AMD was doing way better than usual. Maybe new drivers have boosted the 4070 Ti a lot compared to review versions.
 
The conclusion comments about the 4060 Ti and its cache use are interesting. I've had to undo my overclocks for this game to stop it crashing; I now wonder if it's related, or evidence to support that theory. On a different note, this may be the first AC game in a very long time (or ever) that has been well optimized.
 
Maybe so, but think about what a big difference there was between high and low settings in games 10-15 years ago. Low was always a washed-out, low-poly show for those who didn't have a beefy GPU.

Sure, but if that meant it was actually playable after that, then the entire point of having tweakable settings was accomplished.
I remember BF3, I believe: there was very little difference between high and low, and as a result you needed a beefy, up-to-date PC to run it even on low. It did not scale well.
 
Valhalla was AMD sponsored - the 7900 XT was beating the 4070 Ti handily. AC Mirage is the exact same engine as Valhalla but isn't AMD sponsored; boom, the 7900 XT eats dust. Makes you wonder :roll:
With the recent drivers, AMD GPUs perform a little better compared to their competitors.
 
With the recent drivers, AMD GPUs perform a little better compared to their competitors.
That doesn't seem to be the case; rather, the exact opposite. In the original 7900 XT review last year, the 7900 XT was 9% faster than the 4070 Ti at 1440p. In the 7800 XT review a year later, the difference is 6.5%. Nvidia fine wine.
 
That doesn't seem to be the case; rather, the exact opposite. In the original 7900 XT review last year, the 7900 XT was 9% faster than the 4070 Ti at 1440p. In the 7800 XT review a year later, the difference is 6.5%. Nvidia fine wine.
I'm talking specifically about the game in this review. Recent drivers have improved performance in this title.

I don't know if the situation you mentioned is valid; anyone can cherry-pick numbers to make one side or the other look better, since performance is not uniform between different games.

Therefore, the ideal is to look at performance in the set of games you're interested in instead of just the overall average.
 
That doesn't seem to be the case; rather, the exact opposite. In the original 7900 XT review last year, the 7900 XT was 9% faster than the 4070 Ti at 1440p. In the 7800 XT review a year later, the difference is 6.5%. Nvidia fine wine.

I do believe that AMD needs to work harder on the drivers. There is so much wrong in them - low performance, high power consumption...

Valhalla was AMD sponsored - the 7900 XT was beating the 4070 Ti handily. AC Mirage is the exact same engine as Valhalla but isn't AMD sponsored; boom, the 7900 XT eats dust. Makes you wonder :roll:

This is both unfair and maybe illegal.
I mean, since when does money mean higher performance?
This screams for a class-action lawsuit against the game engine creators to stop the shenanigans. The users lose from this.
 
This is both unfair and maybe illegal.
I mean, since when does money mean higher performance?
This screams for a class-action lawsuit against the game engine creators to stop the shenanigans. The users lose from this.
It is pretty common though; AMD-sponsored games perform much better on AMD GPUs than on other cards. Dunno why, maybe coincidence.
 
I said the 4070 Ti is doing well in this game; his reply was kinda "no it isn't", which is just false. In general it is the slower card in raster, so the fact that it gets ahead in averages means something. Saying it has worse 1% lows at 4K doesn't change the fact that it crept closer to the 7900 XT in performance.


Extra impressive because AC Mirage shares the same engine with Valhalla and Odyssey, and in those games AMD was doing way better than usual. Maybe new drivers have boosted the 4070 Ti a lot compared to review versions.
Please don't misrepresent my argument. I didn't say that the 4070 Ti wasn't doing well. I said
99th percentile frame times are more important than average fps and there, the 7900 XT does fine against the 4070 Ti.

This was in response to your post claiming (emphasis added by me)
AC Mirage is the exact same engine as Valhalla but isn't AMD sponsored; boom, the 7900 XT eats dust. Makes you wonder

I'm not the one exaggerating a minuscule difference. Now, in general, the 7900 XT is faster than the 4070 Ti in rasterized scenes. This is a good showing for the 4070 Ti, because it matches the 7900 XT at 1440p in 99th percentile frametimes while beating it in average fps.
 