
Assassin's Creed Shadows Performance Benchmark

Maybe it's just because I'm not looking at the quality comparisons blown up on my screen, but the difference between 'Low' and 'Max' looks to me like it amounts to a slight difference in detail, some AA, and something I could achieve by fiddling with a post-processing overlay for a few minutes. The graphics don't exactly blow me away, and it's asking me to tank my frames for what, slightly more realistic and sharp lighting/shadows?

I just don't see the point.
 
Welp, the price of admission to the elite 60 fps club is high for this one :pimp:
And no, the graphics do not justify the absurd requirements in any way.

The 50 series cards are doing surprisingly poorly, even with RT and/or upscaling. Or perhaps it's the Radeon cards doing so well. Good to see the 9070 XT trading blows with the previous AMD flagship, and consistently outperforming current offerings from Nvidia.
 
sure you do, at 1080p
:D
I didn't get a 4090 and a 9800X3D to play at anything other than 4K ultra settings at as many fps as possible... Always 60+ :peace:
 
Wow... the 9070 XT beats or is on par with the 5080 at all 3 resolutions, and only loses to it when enabling RT... by 3 fps... That is pretty amazing.
Once again, people are looking at the wrong window. 0.1 fps.

0.1... fps. Once again, overclock that boy for that 1.1 fps and be joyous.

I keep telling people... This is what to start expecting. This isn't going to be abnormal, and is in fact the new normal. And the 9070 XT is doing exactly as it should.
The difference between RT and High settings is... 33% less fps...
Just that... Nothing else...
Disappointment...
That's the way it should be. RT should scale like resolution: 1080p RT should be 1440p raster. Just like, in the future, 1080p PT should be 1440p RT should be 4K raster.
Just like 1080p RT should be ~1440p upscaled RT. It should all flow in a seamless and predictable manner.
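The "RT should scale like resolution" rule of thumb can be sanity-checked with simple pixel-count arithmetic. This is just a naive sketch (fps scaling inversely with pixel count, which ignores CPU limits, memory bandwidth, etc.), and the 100 fps baseline below is a made-up illustrative number, not benchmark data:

```python
# Pixel counts for common resolutions
PIXELS = {
    "1080p": 1920 * 1080,   # 2,073,600
    "1440p": 2560 * 1440,   # 3,686,400
    "4k":    3840 * 2160,   # 8,294,400
}

def expected_fps(base_fps, base_res, target_res):
    """Naive model: fps scales inversely with pixel count."""
    return base_fps * PIXELS[base_res] / PIXELS[target_res]

# If a card does 100 fps at 1080p raster, this model predicts
# ~56 fps at 1440p and exactly 25 fps at 4k.
print(round(expected_fps(100, "1080p", "1440p")))  # 56
print(round(expected_fps(100, "1080p", "4k")))     # 25
```

Under that rule, enabling RT at 1080p should land near the 1440p raster number, i.e. roughly a 44% hit — somewhat larger than the ~33% RT cost quoted above, but in the same ballpark.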



Lol @ the 5080 can't keep 60 fps even w/o RT @ 1440p native. Am I surprised? No. Are some people? Probably.
The 4080 (and Super) falling further under 60. Am I surprised? No. Are some people? Probably. Pretty soon no overclock will save it, and the 9070 XT... still 60. Hammering that point home until people realize it's true.
The 5070 Ti barely in the VRR window at 1440p native. Am I surprised? No. Are some people? Probably. Expect this trend (especially wrt the 5080 in native 1440p RT).

This... most importantly the last part... this is why you stutter struggle. Just the straight truth. I don't say it for my health, or to promote a brand; it's just reality.
Look at the recent cards that already are under 60 (like a Ti Super). Did you expect that when you bought them? You should have. That's what I'm saying. And the same things will happen to the 5070 Ti/5080 in those situations.
 
I didn't get a 4090 and a 9800X3D to play at anything other than 4K ultra settings at as many fps as possible... Always 60+ :peace:

You can't in most new games. Not sure what you're playing.
 
You can't in most new games. Not sure what you're playing.
Well this is an article about AC: Shadows and that's one game right there where I can and do. I'm not too proud to not use frame gen where I need to either. I saved my money and paid a good chunk of change (almost $5K USD now or $7K if you want to include the TV) for this PC build to be able to do 4K/60 Max and it does in 99% of what I throw at it... With more than half getting to 90-120 fps range. I'm very happy with what I have. It's all good my friend!
 
How about sending the developers to Guantanamo prison for 3 months? The game can’t keep 60 FPS with midrange cards even with upscaling.

Btw the 3070 performs clearly worse than the 8 GB 4060 Ti. Is this a bug, or are Nvidia's dirty driver tricks getting dirtier? :nutkick:
The 5090 can't hit 60 fps smoothly without upscaling at 4k maximum settings and dips to mid 40s at native.
 
Looks like FSR 4 is working with OptiScaler with DLSS inputs. It's really not that hard to set up, but I am wondering why AMD is slow walking the automatic FSR 4 support for FSR 3.1 games. Feels like FSR 4 is still somewhat in beta, like FSR 3.1 support isn't guaranteeing good results yet.
 
Not that I'm interested in this game at all... or anything from Ubisoft anymore, but man does it look unappealing. Idk, it's just not the world I would want to walk around in.
 
Well this is an article about AC: Shadows and that's one game right there where I can and do. I'm not too proud to not use frame gen where I need to either. I saved my money and paid a good chunk of change (almost $5K USD now or $7K if you want to include the TV) for this PC build to be able to do 4K/60 Max and it does in 99% of what I throw at it... With more than half getting to 90-120 fps range. I'm very happy with what I have. It's all good my friend!
[Attached screenshot: Assassin's Creed Shadows Performance Benchmark Review results]


lol. Always 60+ eh?
 
I have come to the conclusion that most of the haters are people that only watch youtube videos and don't actually play the game.
Nah, the game is not bad, and it's not good either. A mid product with a mid story, sometimes janky controls, bad feedback in combat, and awful AI.

You can find better games to spend your money and time on right now.

And nobody hates it; people are just memeing and making fun of the game and Ubisoft.
It's exactly the game you want to watch on YouTube instead of playing yourself.
 
Maybe it's just because I'm not looking at the quality comparisons blown up on my screen, but the difference between 'Low' and 'Max' looks to me like it amounts to a slight difference in detail, some AA, and something I could achieve by fiddling with a post-processing overlay for a few minutes. The graphics don't exactly blow me away, and it's asking me to tank my frames for what, slightly more realistic and sharp lighting/shadows?

I just don't see the point.

100% - these are some of the worst Min/Max comparison shots I've ever seen, in the sense that there really doesn't seem to be much, if any, difference in quality at the expense of running at half the framerate.
 
I'm interested in knowing what exactly is RT in these scenes. And trying to find the differences with raster based AC game rendering.

Not that I'm interested in this game at all... or anything from Ubisoft anymore, but man does it look unappealing. Idk, it's just not the world I would want to walk around in.
It looks... bland, not atmospheric. It's a very clean image, even in places where you'd expect more of, well, everything: dust, particles, etc. Where are they? Volumetric lighting is also rather absent.
 
You're basing this off the fact that you are playing the game, or are you just another tard that watches reaction videos by influencers?
I played the game. Don't want to play it anymore. Got bored after 2-3 hours. And I'm glad I did not buy it and used a friend's account, because if I had actually spent $60 on it, I would have spoken in a harsher and more annoyed way.
The game itself is OK for $20 during some summer sale.
 
I have come to the conclusion that most of the haters are people that only watch youtube videos and don't actually play the game.

That said, my wife has that "Ubisoft subscription thing", so she preloaded the game a couple days ago and got to play it last night after a driver update. From what I saw, a very good-looking game. The game almost justifies her upgrading her nearly 5-year-old RX 6900 XT. She can play at about 70 FPS with 1080p balanced upscale and frame gen for quasi 4K with RT off. It still looks extremely good even with those trade-offs. She said the combat system is like all the best ACs had an orgy and this is the baby they made.
I watched a YT video of Gameplay. It was over an hour long and I watched the whole thing. It looks great and has a really good story as well.
 
Perhaps you missed the part where I said I'm not too proud to not use frame gen where I need to, eh? Enjoy your day
So interpolated frames with artefacts count as performance for you? And all games support this, yes?
Well, enjoy then.
 
The game looks awful without any RTGI... Watch Digital Foundry's videos...
It is better to turn RTGI on (even if you have to sacrifice some settings) than to run at max settings with RT completely off.
 
So interpolated frames with artefacts count as performance for you? And all games support this, yes?
Well, enjoy then.
Come on I didn't say every game supports it nor did I say I use it in every game. In some games frame gen looks bad and in others you can't tell that it's a 'fake' frame. Here's what it all boils down to for me... I play games to relax or have fun or spend time with my son or etc etc etc. Nit-picking every frame in a game isn't one of them. It used to be but once I let it go and just enjoyed what is on the screen in front of me my enjoyment of games has gone back up to what it was as a young child when I just played the game without worrying about every little detail on the screen and how it got there.
 
Yes, the graphics are gorgeous, the game looks stunning, one of the absolute best I've ever seen, but it does not justify the hardware requirements. RDR2 looks slightly worse AND DOES NOT RELY ON RAY TRACING, but ACS is literally THRICE (3×) as demanding.
Just played it for a good eight hours: everything other than graphics in this game feels MID, a huge, solid MID, not even a 6, just an undeniable 5
 
1080p balanced upscale and frame gen for quasi 4k with RT off
:laugh: Optimised the way it's meant to be optimised

if people think this is an acceptable way of playing games in 2025 then that is the reason we keep being fed shit by GPU makers and told it's caviar!
 
I'm getting about 75 fps on an RTX 3090 at 1440p, everything set to the highest settings, DLSS 4 + frame gen (DLSS-to-FSR mod). Runs extremely well and looks gorgeous.

As for the game itself, I've been really enjoying it. I think it could potentially be one of the best AC games yet. It's also surprisingly polished compared to Valhalla; I haven't run into any game-breaking bugs or CTDs so far.
 
I had next to no good will to show towards this game to begin with, but anything that honestly barely scrapes by 60 fps on a 9800X3D/5090 system... haha.

My curiosity led to randomly clicking a video timestamp that landed on this highly concise six-word summary.

...5080 is now a 720p card.
 
Come on I didn't say every game supports it nor did I say I use it in every game. In some games frame gen looks bad and in others you can't tell that it's a 'fake' frame. Here's what it all boils down to for me... I play games to relax or have fun or spend time with my son or etc etc etc. Nit-picking every frame in a game isn't one of them. It used to be but once I let it go and just enjoyed what is on the screen in front of me my enjoyment of games has gone back up to what it was as a young child when I just played the game without worrying about every little detail on the screen and how it got there.
Don't mind such comments; sadly, those are more common around here than they should be.
Even mentioning that you use upscaling is treated like you worship the devil or something, which I've experienced a good few times around here.
Just enjoy your games however you like them, and don't stress over whatever the so-called tech snobs say or think about how you should play your games, because they aren't worth your time in the first place. :)
 