
Assassin's Creed Odyssey: Benchmark Performance Analysis

W1zzard

Assassin's Creed Odyssey is this year's installment of the epic series and takes you to ancient Greece. We tested the game at 1080p, 1440p, and 4K using a large selection of graphics cards, including the RTX 2080 and RTX 2080 Ti. We also investigated VRAM requirements and CPU performance scaling.

Thanks for the test @W1zzard! I like the setting, being Greek myself, but I can't deny the by-now-classic unoptimised character of the game engine, which has been the norm for years, ever since AC Unity. As for the VRAM usage, I suggest you try again with, say, an RX 580 8 GB to see if the game asks for the same VRAM when the GPU has less than the tested one (most games behave that way, caching VRAM even when it isn't needed).
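
For anyone who wants to check the caching theory themselves, here is a minimal monitoring sketch, assuming an NVIDIA card with nvidia-smi on the PATH and a single GPU (on the Radeon side you would watch the memory-used sensor in GPU-Z instead). Leave it running in a second window while the game plays and compare the peaks between cards:

```python
import subprocess
import time

# Poll nvidia-smi once per second and print used/total VRAM.
# Assumes a single NVIDIA GPU; with multiple GPUs, nvidia-smi
# prints one line per device and this would need a loop.
# Stop with Ctrl+C.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    used, total = (int(v) for v in out.split(", "))
    print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
    time.sleep(1)
```

Note that this reports what the game has allocated, not what it actually needs, which is exactly the caching behaviour in question; only the performance drop on the smaller card settles it.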
 
It's a crap, son... (Beneath a Steel Sky!)
 
The 2080 Ti was heavily promoted as 4K-ready. 4K my arse.
Oh, I almost forgot the word "ready".
 
if the game asks for the same VRAM when the GPU
The performance hit for cards with less than 8 GB is very pronounced and can only be explained by the lack of VRAM.
 
DLSS would be very welcome here.
 
Thanks for the tests @W1zzard!

My problem is the whole premise of the game. Virtually every facet of the assassin mythos had its beginnings in, and was explained by, Origins, because, well, it was the origin. The rest of the series' quirks came after.

Going earlier in history with Odyssey is merely a cash grab trading on the AC title.
 
The 2080 Ti was heavily promoted as 4K-ready. 4K my arse.
Oh, I almost forgot the word "ready".

AMD's fastest GPU in history can't even manage an average of 60 FPS at 1080p, and you're complaining about the 2080 Ti's 4K performance?

Just turning the volumetric clouds down from highest to medium (the highest setting is just a waste of GPU resources) will give you a +40% FPS boost with everything else on max; then 4K will be "ready".
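
Quick sanity math on that +40% claim, with illustrative baseline numbers that are not taken from the review: a 40% uplift only reaches a locked 60 FPS if you already average about 43 FPS, since the budget for 60 FPS is 16.7 ms per frame.

```python
# Illustrative frame-rate arithmetic for the claimed +40% uplift
# from dropping volumetric clouds; the baseline FPS values below
# are made up, not taken from the review.
def with_boost(fps: float, boost: float = 0.40) -> float:
    """FPS after a relative boost, e.g. 0.40 for +40%."""
    return fps * (1.0 + boost)

def frame_time_ms(fps: float) -> float:
    """Per-frame budget in milliseconds."""
    return 1000.0 / fps

for baseline in (43.0, 50.0, 66.0):
    boosted = with_boost(baseline)
    print(f"{baseline:.0f} FPS ({frame_time_ms(baseline):.1f} ms/frame) "
          f"-> {boosted:.0f} FPS ({frame_time_ms(boosted):.1f} ms/frame)")
```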
 
Ya, very poorly optimized IMO. Completely unacceptable that a game this demanding doesn't have multi-GPU support. I will definitely not buy this one.
 
AMD's fastest GPU in history can't even manage an average of 60 FPS at 1080p, and you're complaining about the 2080 Ti's 4K performance?

Just turning the volumetric clouds down from highest to medium (the highest setting is just a waste of GPU resources) will give you a +40% FPS boost with everything else on max; then 4K will be "ready".

Yeah, that comment makes no sense. What normal person with a brain would say stuff like that? It's the fastest card out right now; NVIDIA can't know what games you'll run at what settings.
 
AMD's fastest GPU in history can't even manage an average of 60 FPS at 1080p, and you're complaining about the 2080 Ti's 4K performance?

Just turning the volumetric clouds down from highest to medium (the highest setting is just a waste of GPU resources) will give you a +40% FPS boost with everything else on max; then 4K will be "ready".

Did I mention AMD?
I was criticizing the 2080 Ti's performance. Triggered much?

Yeah, that comment makes no sense. What normal person with a brain would say stuff like that? It's the fastest card out right now; NVIDIA can't know what games you'll run at what settings.

meh..
 
Heavily DRM-laced? The 2080 Ti can't break 90 FPS... wtf?
 
Could we maybe get both "Ultra" and "Acceptable" settings tested? GTA V, for instance, looks just fine on lower settings and uses less than 3 GB of VRAM there, making it playable on older or lower-end cards.
 
Can someone explain the higher frame rates on the 4C/8T configuration than on 6C/12T (I'm assuming it's the same clock speed)?
 
Can someone explain the higher frame rates on the 4C/8T configuration than on 6C/12T (I'm assuming it's the same clock speed)?
I'd say random variation between runs.
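
One way to sanity-check that is to log the average FPS of each benchmark pass and compare the spread; a minimal sketch with hypothetical run values:

```python
from statistics import mean, stdev

# Hypothetical per-pass average FPS from repeated benchmark runs;
# substitute your own logged numbers.
runs = {
    "4C/8T":  [71.2, 69.8, 72.5],
    "6C/12T": [70.4, 71.9, 69.1],
}

for label, fps in runs.items():
    print(f"{label}: mean {mean(fps):.1f} FPS, stdev {stdev(fps):.1f}")

# If the difference between the means is within roughly one stdev,
# the 4C/8T "lead" is indistinguishable from run-to-run noise.
```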
 
I'm not sure how you got 26 FPS for the Vega 64. I have a 1800X and a Vega 64 at stock speed but undervolted, and I'm getting 45 to 36 FPS, with worst-case dips to 32, at 4K with everything maxed out. I have the latest driver.
 
If a GPU costs €1,300 and can't run a game at a locked 60 FPS maxed out, of course we're going to make fun of it.

I absolutely hate the pricing of the RTX 2080 Ti, but this is one game where you can't really blame any graphics card. When there are games that look better and easily run at 1440p ultra on a Vega 64, and this one is only pushing 43 FPS, something is really wrong with the engine. It's making a $1,200 card look worthless, lol. This is one of those times where you blame an old, unoptimized engine. I wouldn't run this on ultra on any card; run it at very high and you will likely get a decent FPS boost.
 
Lol, Ubisoft is going backwards at an incredible rate here. Bravo.
 
I'm not sure how you got 26 FPS for the Vega 64. I have a 1800X and a Vega 64 at stock speed but undervolted, and I'm getting 45 to 36 FPS, with worst-case dips to 32, at 4K with everything maxed out. I have the latest driver.
Is Ryzen that much faster than the i7, or does undervolting add that much performance? Only @W1zzard can test that to make sure, I guess...
 
Playing it now, and it's a really good game. If you need more performance, hit the cloud settings first; they're a hog.
 
The 2080 Ti was heavily promoted as 4K-ready. 4K my arse.
Oh, I almost forgot the word "ready".

AMD-sponsored games are promoted as excellent performers. So what happened? What AMD black boxes were used?
 
AMD-sponsored games are promoted as excellent performers. So what happened? What AMD black boxes were used?

Dunno; why are you asking me? Do I look like an AMD employee or one of the developers?
Got triggered too?

Lol, Ubisoft is going backwards at an incredible rate here. Bravo.
Indeed, and I didn't see anything graphically special in this game, except that it demands a tremendous amount of GPU power and VRAM. /facepalm
 