
Assassin's Creed Mirage Performance Benchmark

99th percentile frame times are more important than average fps, and there the 7900 XT does fine against the 4070 Ti.

All of them matter: average framerate, 99th percentile frame times, and 1% lows.
I guess "eats dust" refers to the average fps dropping from 160 fps to as low as 135 fps, which is rather significant, all else being equal!
 
Hey @W1zzard


You tested with AMD driver 23.9.3 WHQL, while the German magazine tested with AMD Software 23.20.11.7.

Can you run the benchmark again?

View attachment 316593
 
Hey @W1zzard


You tested with AMD driver 23.9.3 WHQL, while the German magazine tested with AMD Software 23.20.11.7.

Can you run the benchmark again?

View attachment 316593
In that case he would probably be better off waiting for a newer Nvidia driver too.
 
Except for recent UE5 games and Starfield, I guess... These cards are going to keep trading blows until the 4070 Ti runs out of VRAM. Will that be in 2 years, or 5? Who knows.
Both of them are known bloatware, practically beta versions... they need beefy hardware and yet look like games from last year or the year before.
 
This new driver puts the mighty RX 7900 XTX 24 GB on top of the stack :D



A new review is definitely needed.
 
So the game looks great and runs great. And it's more compact, so it probably didn't take 5+ years to develop.

I hope it's a success and it shows developers that games don't need uber next-gen graphics that require a 4090 just to be playable, or gigantic empty worlds with a 50-hour long main story. It's time to scale down both software and hardware.
 
that games don't need uber next-gen graphics that require a 4090 just to be playable

They need to drop the RTX culture, which is exactly what the RTX 4090 was created for.
In a truly ray-traced environment, even the RTX 4090 would eat dust, because the framerate would drop to an unplayable 0-1 fps.
 
Both of them are known bloatware, practically beta versions... they need beefy hardware and yet look like games from last year or the year before.

This looks like a console game from 4-5 years ago, so I'm not sure it's the best example either.
 
Different benchmark scene

It still blows me away that gamers think reviewers bench the exact same scene, or that games perform exactly the same in every part of the game.
 
It seems that in recent games the 4070 Ti is edging ahead of the 7900 XT, whereas it was 10% slower at launch. Is it drivers or what?

It's also very strange that the 4070 Ti is comfortably beating the 3090 Ti at 1080p and 1440p, and is still ahead at 4K, something it normally never does. Is Nvidia hobbling Ampere?

@ARF link your source. Your single-card results look ultra cherry-picked.
 
Hey @W1zzard


You tested with AMD driver 23.9.3 WHQL, while the German magazine tested with AMD Software 23.20.11.7.

Can you run the benchmark again?

View attachment 316593

An 8 fps gain over the old driver isn't too bad, unless I'm reading it wrong. The TPU 1080p result is only 8 fps slower here.
 
I honestly haven't ever been less interested in a game. It looks identical to the deserts of AC Origins (which is 6 years old!!!!), except without the Egyptian theme, which was what made that game interesting.

If Ubisoft doesn't step it up big time with the next game, then this series will soon go the way of the dodo...

And it doesn't look better than Crysis 3, a 2013 title.
Ten years later, Crysis 3 is still the benchmark for state-of-the-art graphics.

View attachment 316483

Crysis 3 looked amazing in 2013, no doubt about it... but plenty of games since then have looked substantially better. Battlefront from 2015 already did.

But this AC title certainly doesn't look amazing... it looks like what it is: dated.
 
Except for recent UE5 games and Starfield, I guess... These cards are going to keep trading blows until the 4070 Ti runs out of VRAM. Will that be in 2 years, or 5? Who knows.
And by that time the 7900 XT won't be able to max out games anyway, because the GPU will be too weak.
 
And by that time the 7900 XT won't be able to max out games anyway, because the GPU will be too weak.

Maybe, but I'm still not going to give Nvidia a pass for charging $800+ for a 12GB card just because they want you to spend $1,100+ on a 16GB one.
 
Maybe, but I'm still not going to give Nvidia a pass for charging $800+ for a 12GB card just because they want you to spend $1,100+ on a 16GB one.
I know two people who bought a 4070 Ti and they are more than satisfied, plus they paid more like 650-700 dollars, not 800+.

Both of them run 1440p, which seems perfect for this card, performing like a 3090 Ti here (which was 1,999 dollars just 9-10 months before the 4070 Ti came out), just with half the power usage and little to no heat.

12GB is more than enough VRAM for 1440p and even 4K/UHD, and I doubt this will change before the next console generation (not refreshes).

In a few years you won't be able to max out games at 4K with a 4070 Ti anyway, so you'll be lowering settings, and with them VRAM usage.

Even my 4090 will probably be considered mediocre by 2025-2026, regardless of it having 24GB of VRAM.



The 4070 Ti 12GB beats the 3090 24GB in most new games coming out, even at 4K. 2-3 years ago the 3090 was considered a 4K beast with "VRAM for years"... Futureproofing is pointless.

In this game, even the 3090 Ti is beaten by the 4070 Ti at 4K on max settings, in both averages and minimums. Twice the VRAM, zero difference.

Ada has better memory compression and a much better cache setup (hence roughly ten times the cache vs Ampere), and all of this lowers actual memory usage.
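On the cache point, the main effect of a bigger on-die cache is that fewer requests have to go out to VRAM at all. Toy numbers only (the request count, access size, and hit rates below are assumptions, not real Ampere/Ada figures):

```python
# DRAM traffic per frame = memory requests that miss in cache * bytes per request.
requests_per_frame = 200_000_000   # assumed number of memory requests per frame
bytes_per_request = 32             # assumed access granularity

for label, hit_rate in (("small cache", 0.30), ("large cache", 0.60)):
    traffic_gb = requests_per_frame * (1 - hit_rate) * bytes_per_request / 1e9
    print(f"{label}: ~{traffic_gb:.1f} GB of VRAM traffic per frame")
```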

Yep, by 2027-2028, when the RTX 6000 series probably comes out, 12GB might be too little to max out demanding games at 4K. Yet today's GPUs will be horribly slow by then regardless of having enough VRAM, including the 4090. This is why futureproofing is pointless. If you want proper performance and optimization (from both game devs and AMD/Nvidia), you stay on the newer architectures.

I can hit 18.5GB of usage on my 4090 in Cyberpunk fully maxed out, FOV 100 and Path Tracing + DLSS 3.5 + FG, yet a 4070 Ti can run the same settings without running out of VRAM.

More VRAM = more allocation. Tons of game engines allocate a given percentage of VRAM. You can't really trust software readings anyway; you simply look at minimum fps, because that will drop fast when the card is VRAM-starved, and from what I see, zero games need more than 12GB at 4K yet.
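To make the allocation-vs-need distinction concrete, here's a toy model with made-up numbers (the reserve fraction and working set are pure assumptions, not the measured behavior of any real engine):

```python
# An engine that pre-allocates a fixed share of whatever VRAM it detects will
# "use" more memory on a bigger card even though the data it actually touches
# (the working set) is identical on both.
working_set_gb = 9.0      # assumed data the game really needs resident
reserve_fraction = 0.80   # assumed share of detected VRAM the engine grabs up front

for card_vram_gb in (12, 24):
    allocated_gb = max(working_set_gb, reserve_fraction * card_vram_gb)
    print(f"{card_vram_gb} GB card -> monitoring tool reports ~{allocated_gb:.1f} GB in use")
```

Which is why the minimum-fps comparison is a more telling signal than the overlay number.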


Let's see if Nvidia pushes Neural Texture Compression to developers. They claim you don't need a huge VRAM buffer to deliver the best textures. Maybe game developers should get smarter instead of lazier.
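For a sense of scale, some rough texture-footprint arithmetic (the block-compression ratio is standard; the neural-compression factor is only a placeholder assumption, since NTC hasn't shipped in any game):

```python
# Memory footprint of a single 4096x4096 texture with a full mip chain,
# uncompressed vs. BC7 vs. a hypothetical further ~4x from neural compression.
width = height = 4096
bits_per_texel_raw = 32          # RGBA8
mip_chain_factor = 4 / 3         # a full mip chain adds roughly one third

raw_mb = width * height * bits_per_texel_raw / 8 / 2**20 * mip_chain_factor
bc7_mb = raw_mb / 4              # BC7 stores 8 bits per texel -> 4:1 vs RGBA8
ntc_mb = bc7_mb / 4              # assumed extra ~4x; purely illustrative

print(f"raw ~{raw_mb:.0f} MB, BC7 ~{bc7_mb:.0f} MB, neural (assumed) ~{ntc_mb:.0f} MB")
```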

PCs get faster and faster, yet more and more games are not optimized well.
 
I know two people who bought a 4070 Ti and they are more than satisfied, plus they paid more like 650-700 dollars, not 800+.

Both of them run 1440p, which seems perfect for this card, performing like a 3090 Ti here (which was 1,999 dollars just 9-10 months before the 4070 Ti came out), just with half the power usage and little to no heat.

12GB is more than enough VRAM for 1440p and even 4K/UHD, and I doubt this will change before the next console generation (not refreshes).

In a few years you won't be able to max out games at 4K with a 4070 Ti anyway, so you'll be lowering settings, and with them VRAM usage.

Even my 4090 will probably be considered mediocre by 2025-2026, regardless of it having 24GB of VRAM.

The 4070 Ti 12GB beats the 3090 24GB in most new games coming out, even at 4K. 2-3 years ago the 3090 was considered a 4K beast with "VRAM for years"... Futureproofing is pointless.

In this game, even the 3090 Ti is beaten by the 4070 Ti at 4K on max settings, in both averages and minimums. Twice the VRAM, zero difference.

Ada has better memory compression and a much better cache setup (hence roughly ten times the cache vs Ampere), and all of this lowers actual memory usage.

Yep, by 2027-2028, when the RTX 6000 series probably comes out, 12GB might be too little to max out demanding games at 4K. Yet today's GPUs will be horribly slow by then regardless of having enough VRAM, including the 4090. This is why futureproofing is pointless. If you want proper performance and optimization (from both game devs and AMD/Nvidia), you stay on the newer architectures.

I can hit 18.5GB of usage on my 4090 in Cyberpunk fully maxed out, FOV 100 and Path Tracing + DLSS 3.5 + FG, yet a 4070 Ti can run the same settings without running out of VRAM.

More VRAM = more allocation. Tons of game engines allocate a given percentage of VRAM. You can't really trust software readings anyway; you simply look at minimum fps, because that will drop fast when the card is VRAM-starved, and from what I see, zero games need more than 12GB at 4K yet.

Let's see if Nvidia pushes Neural Texture Compression to developers. They claim you don't need a huge VRAM buffer to deliver the best textures. Maybe game developers should get smarter instead of lazier.

PCs get faster and faster, yet more and more games are not optimized well.

I don't think either of these companies needs defending; they're both asking a lot of money for the privilege of using their products. They should both be offering products at every price point without compromise. Even the 4090 was cut down more than usual, more like an 80 Ti-class product. Same with AMD: FSR 2 should be way better by now, and they really need to improve RT to get closer to parity with Nvidia.

We should honestly expect more, and defending either company doesn't make a ton of sense to me.

At the end of the day, people have to decide what is worth their money. If that's a 4070 Ti, good for them.
 
I know two people who bought a 4070 Ti and they are more than satisfied, plus they paid more like 650-700 dollars, not 800+.

Both of them run 1440p, which seems perfect for this card, performing like a 3090 Ti here (which was 1,999 dollars just 9-10 months before the 4070 Ti came out), just with half the power usage and little to no heat.

12GB is more than enough VRAM for 1440p and even 4K/UHD, and I doubt this will change before the next console generation (not refreshes).

In a few years you won't be able to max out games at 4K with a 4070 Ti anyway, so you'll be lowering settings, and with them VRAM usage.

Even my 4090 will probably be considered mediocre by 2025-2026, regardless of it having 24GB of VRAM.

Don't worry. Your 4090 will still be the top dog even then. We all know Nvidia is about to delay the 5000 series by several more years.

I disagree with you. 12 GB of VRAM is simply insufficient, and games today do need up to 14-15 GB of VRAM to run well.
I'll take your example as artificially fabricated to fit your agenda.

When you pay 650, 700, or 800 dollars, you definitely want more VRAM, regardless of the fact that greedy Nvidia wants to sell you less.

View attachment 316857

View attachment 316856

 
This review has to be done again with new drivers.

Here is the 7800 XT at 1440p ultra settings:


As for no HT support for Intel CPUs, imagine that, and this game is Intel-sponsored :D
 
Can Ubisoft stop remaking Assassin's Creed games and make a Splinter Cell please, damn them!
WAB, a YouTube reviewer, says this game is shiet!
 
Don't worry. Your 4090 will still be the top dog even then. We all know Nvidia is about to delay the 5000 series by several more years.

I disagree with you. 12 GB of VRAM is simply insufficient, and games today do need up to 14-15 GB of VRAM to run well.
I'll take your example as artificially fabricated to fit your agenda.

When you pay 650, 700, or 800 dollars, you definitely want more VRAM, regardless of the fact that greedy Nvidia wants to sell you less.

View attachment 316857

View attachment 316856
The problem is that neither Hogwarts nor Cyberpunk can be played at those settings on an AMD card, so sure, you might want more VRAM on the 4070 Ti, but the extra VRAM on the 7900 XT doesn't make those games any more playable. That is the MAJOR point that people keep missing when they bring up the VRAM argument. Yes, more VRAM on Nvidia cards would have been nice, but an AMD card is not an option because most of the games that require tons of VRAM have RT (and FG, which also uses more VRAM), and AMD is majorly inferior in those cases.
 
On a different note, this may be the first AC game in a very long time (or ever) that has been well optimized.
Engine-wise it's 100% Valhalla, and just as well (or badly) optimized as that was. It's just that hardware has had some time to catch up since then.
 