
The Callisto Protocol Benchmark Test & Performance Analysis

Thanks @W1zzard for your efforts. Sadly, I'm giving this one a pass, mainly due to the terrible performance issues on PC; I'm not willing to pay full price for something that wasn't properly tested.
I also feel like the gameplay isn't that impressive; check the various reviews. On the other hand, I do want to complete the story. Cheat Engine's speedhack at x1.3 helps a lot to overcome the slow movement ;)
 
Is it a good game? Do you recommend it?
People say good things and bad things about the game.
 
I also feel like the gameplay isn't that impressive, check the various reviews. On the other hand I do want to complete the story. Cheat Engine speedhack x1.3 helps a lot to overcome the slow movement ;)
That sounds a lot like the original Dead Space; that one also felt a bit sluggish from what I recall.
 
The visual fidelity may be seriously amped up, but otherwise this is clearly a mediocre UE4 implementation and it performs as such: the stutterfest on PC, the huge discrepancies between the two leading consoles, the lower results on AMD hardware, etc.
For reference, if you're wondering what a great UE4 implementation looks like: Gears 5.
 
The best-looking video games of 2022, A Plague Tale: Requiem and The Callisto Protocol, still use less than 10 GB of VRAM at any resolution.
 
I was given this game for free, now that it's been patched for stuttering I'll probably give it a go.
 
I also feel like the gameplay isn't that impressive, check the various reviews. On the other hand I do want to complete the story. Cheat Engine speedhack x1.3 helps a lot to overcome the slow movement ;)
I use this all the time in slow games. I usually start at 2x, and more often than not end up at 2.5-3x and run it at that.
Such a nice life hack. :laugh:
 
Very unoptimized game. Will become better in performance after 2-3 months maybe. Gameplay and story are the most critical aspects though.
 
RX 6650, 3060 Ti on 1360x768?
 
There must be something wrong with UE4. I remember playing Mass Effect: Andromeda on a 2070. In most scenes, the game capped out at 120 FPS, but in some, I barely passed 30... at 1080p.
UE4 has been around for a long time and a ton of games were built on it...did they release a patch that's causing this problem?

The best-looking video games of 2022, A Plague Tale: Requiem and The Callisto Protocol, still use less than 10 GB of VRAM at any resolution.
That's to be expected for UE4, though... when games are built on UE5 from the ground up, VRAM use will likely shoot up.
 
It's probably already a ton of work just doing the rasterized performance, and it really isn't apples to apples because AMD performance would be dismal.
I don't follow how testing the same workload wouldn't be apples to apples? It's an AMD sponsored title apparently, which usually means the RT effects are minimal and lower resolution so that RDNA2 cards don't tank so hard.
 
I don't follow how testing the same workload wouldn't be apples to apples? It's an AMD sponsored title apparently, which usually means the RT effects are minimal and lower resolution so that RDNA2 cards don't tank so hard.

At this point anyone buying RDNA 2 and probably 3 doesn't care about RT. I personally want it to be tested but I can see why it isn't from a workload perspective.

Same reason why DLSS/FSR isn't tested even though a lot more people probably turn them on vs using RT.
 
Haven't had time to upgrade the GPU test system yet .. it takes 3 weeks of non-stop testing to retest all cards and all games. Right now I'm working on reviews of several upcoming GPUs and CPUs.

AMD sent no RX 6950 XT reference card. I asked and explained that it won't be part of my comparisons; they still didn't bother .. "we don't have any cards". Can you send me a 6950 XT reference card? I could give you another card in return.



Just not enough time, I barely managed to get this posted, and Portal
Thank you a lot for the reply, W1zzard! We can understand better now. Maybe next time put a (*) to explain why (like for Arc and the 6950 XT), and for the CPU used in the review; still, it's super that you took the responsibility to at least ask for the 6950 XT. Very sad AMD didn't send one, but you know you could just buy one, or ask any AIB and set the same frequency etc., or just mention the name of the card like many other reviewers do; it gives us an idea, which is better than nothing. Because as you can see in screenshot 3, there's a big step from

AMD 6900 XT reference
AMD 6900 XT LC ≈ AIB 6900 XT XH-die card
AMD 6950 XT

So having at least a 6950 XT reference, or a note that it's not possible, would be great, and the same goes for any card with that type of jump.
[screenshot attachments]
 
any AIB and put same frequency
All cards in my tests are reference, using an overclocked AIB card certainly wouldn't be fair. Setting same frequency doesn't work on RDNA2, because whatever you set in Radeon Settings is just a general idea, and the GPU will do what it wants on its own. Also several power limit values in BIOS will be different on the card, and fan settings, which affect temperature, which affects clock speeds.
 
All cards in my tests are reference, using an overclocked AIB card certainly wouldn't be fair. Setting same frequency doesn't work on RDNA2, because whatever you set in Radeon Settings is just a general idea, and the GPU will do what it wants on its own. Also several power limit values in BIOS will be different on the card, and fan settings, which affect temperature, which affects clock speeds.
Yes, I know; I updated my post because the three screenshots were giving an error. It was just to have one, as better than nothing, but I totally understand. Maybe your team can buy one now that they're cheap; it would be useful for plenty of reviews to come. Or work out a deal with AMD, etc.
 
Cheat Engine speedhack x1.3 helps a lot to overcome the slow movement ;)


At the bottom, do I download the Lite version or the version above it? (It's been many years since I used this program.)
 

At the bottom, do I download Lite version or the version above it? (its been many years since I used this program)
I have no idea .. I'm still using 7.0 and haven't updated it for ages. AFAIK there's some adware bundled with some installers of Cheat Engine, so watch out for that.
 

At the bottom, do I download Lite version or the version above it? (its been many years since I used this program)
Use the lite version since it contains "less/different warnings". :laugh:
I have 5.4 and 6.1 as a newer version on my PC. Go figure.
 
The game is seriously broken. On my RX 6700 with v-sync enabled and a 60 fps limit in game, I see 45 fps and the GPU sits at 40-60% utilization. If I remove all the limits, the framerate jumps to 50-55 fps and GPU utilization to 99%, but GPU power consumption stays at 100-120 W.
In all other games this cheap GPU sits at around 140-150 W.
 
I was given this game for free, now that it's been patched for stuttering I'll probably give it a go.
Okay, this game runs like absolute shit on both a 3070@1440p and on an RX6700@1080p

The stutters are mostly fixed, but that still means stuttering. Given how it looks and runs on PS5, this is inexcusable - it's solid proof that the PC release received insufficient effort and was phoned in. I didn't even pay for this (AMD gave me a free copy) and I'm not sure it's worth my time.

It's a shame, because I loved Dead Space, but this is a poor facsimile of DS with a rocky launch, mediocre gameplay, numerous technical problems, pitiful optimisation (I'm not sure it's been optimised in any way at all) and seemingly minimum effort from the developer for PC gamers. It makes for some pretty screenshots and that's about it, IMO.
 
I too got it for free; I would never buy this piece of crap game.
PC ports are sometimes just stupid in what they want to achieve and don't make sense. For example, when they ported Horizon Zero Dawn to PC, performance was awful and it was difficult to spot the difference from the console graphics. Some things were added, but wasted on details you don't necessarily look at that much; the biggest performance hit was from the volumetric clouds, and if I remember correctly some YouTuber said the clouds look a bit more puffy.
They would never waste compute power on consoles for things you don't notice; on PC, no worries, they have RTX 4090s to play PS5 games.
 
I'd love to see the new Radeon 7900 XT and XTX added to this, W1zzard!
 
UE4 has always been a janky mess. From stuttery shaders, to borked defaults for ultrawide and left-handed keyboard & mouse players, to the most ridiculous defaults for that absurd Eye Adaptation / Auto Exposure effect, which is the first thing I disable, so often that I know r.EyeAdaptationQuality=0 off by heart, I've always found it massively overhyped as some "evolution" of UE1-3.
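For anyone else who hates that effect: the same cvar can be made permanent per game by dropping it into the UE4 user config instead of typing it in the console every time. A sketch, assuming the standard UE4 shipping layout (the game-specific folder name varies per title):

```ini
; %LOCALAPPDATA%\<GameFolder>\Saved\Config\WindowsNoEditor\Engine.ini
; (<GameFolder> is the UE4 project name, which differs per game)
[SystemSettings]
; 0 disables the auto exposure / eye adaptation effect entirely
r.EyeAdaptationQuality=0
```

Some games mark the file read-only or validate it, so this doesn't work everywhere, but it's the usual first thing to try.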

As for The Callisto Protocol, I'm more interested in whether it's actually fun. There seem to be an awful lot of reviews (currently way down at Mixed) saying that even ignoring all the performance problems, the gameplay is a "pale shadow of Dead Space", "soulless", has "extremely janky combat", etc. Can't say I'm in any rush to blindly throw £50 at that even if it got 9999 fps.
I found it quite fun. Very linear, yes, but great entertainment in an AAA package; a cinematic experience with my settings on my 42” OLED. Yes, there were performance issues on day 1, but compared to now, with patches 2-3, it's a different game. For me ray tracing is a marketing hoax, so what I found to be the absolute best quality vs performance is 4K, HDR, everything on high, no RT, no AA, no motion blur, no film grain, no depth of field. Most important of all, though, is forcing sharpening at 75-100% in the Nvidia driver; it makes a BIG difference! I HATE blurry games/textures and this REALLY helps, especially when AA is disabled. If the performance demands are too high, you can set shadows and volumetrics to medium with no problem. I also used a frame limiter at 80 fps to reduce the stress and power draw on my 4090, as this type of game does not in any way need 120 Hz. The price was too high, though. Anyway, I think it was a great game and experience, but without a doubt I'm looking forward to the Dead Space remake in January 2023. I also always set the NV Control Panel texture quality to High Quality vs the default Quality; it makes things a tiny bit sharper overall at virtually no performance cost.

I don't know how you (W1zzard) managed to benchmark this game; it stutters and results vary from one run to another. You have to make at least one run, and the second is already better. Also, the GPU sometimes is not fully utilized; it sits at 60-70% and the framerate is 30 fps for me.
I think after some patches this game could look very different.
AMD-sponsored title and Nvidia does better :laugh: some people will get fired for this.
I think it's a CPU bottleneck, judging by your GPU usage; this game does not use 100% of the CPU when CPU-limited. Digital Foundry mentioned that a CPU could be at 80-85%, not go any further, and still be the bottleneck. I can max my 4090 at 100% with my 13900K. One of the tricks they used in the patch to combat stutter was disabling hyper-threading, I noticed, so the game is, at least in my case, limited to only the physical P-cores, no HT and no E-cores, so the cores in use need to be very powerful while never hitting 100% either.

Okay, this game runs like absolute shit on both a 3070@1440p and on an RX6700@1080p

The stutters are mostly fixed, but that still means stuttering. Given how it looks and runs on PS5, this is inexcusable - it's solid proof that the PC release received insufficient effort and was phoned in. I didn't even pay for this (AMD gave me a free copy) and I'm not sure it's worth my time.

It's a shame, because I loved Dead Space, but this is a poor facsimile of DS with a rocky launch, mediocre gameplay, numerous technical problems, pitiful optimisation (it's not sure if it's been optimised in any way at all) and seemingly minimum effort from the developer for PC gamers. It makes for some pretty screenshots and that's about it, IMO.
I found it quite fun. Very linear, yes, but great entertainment in an AAA package; a cinematic experience with my settings on my 42” OLED. Yes, there were performance issues on day 1, but compared to now, with patches 2-3, it's a different game. For me ray tracing is a marketing hoax, so what I found to be the absolute best quality vs performance is 4K, HDR, everything on high, no RT, no AA, no motion blur, no film grain, no depth of field. Most important of all, though, is forcing sharpening at 75-100% in the Nvidia driver; it makes a BIG difference! I HATE blurry games/textures and this REALLY helps, especially when AA is disabled. If the performance demands are too high, you can set shadows and volumetrics to medium with no problem. I also used a frame limiter at 80 fps to reduce the stress and power draw on my 4090, as this type of game does not in any way need 120 Hz. The price was too high, though. Anyway, I think it was a great game and experience, but without a doubt I'm looking forward to the Dead Space remake in January 2023. I also always set the NV Control Panel texture quality to High Quality vs the default Quality; it makes things a tiny bit sharper overall at virtually no performance cost.
Game is seriously broken, on my rx 6700 with vsync enabled and 60 fps limit in game i see 45fps and gpu sits at 40-60% utilization, i remove all limits and framerate jumps to 50-55fps and gpu utilization to 99%, but, gpu power consumption is at 100-120W.
In all other games this cheap GPU sits at around 140-150W.
Use fullscreen mode, fps unlimited, v-sync off in game but forced in the driver with triple buffering enabled; that was the best result for me. Set shadows and volumetrics to medium or low, it makes a huge impact. No AA, sharpening in the driver at 75%; it cleans up the blurry look, which I hate, and makes it razor sharp. I also always set the NV Control Panel texture quality to High Quality vs the default Quality; it makes things a tiny bit sharper overall at virtually no performance cost.

You could be CPU-limited. In this game you are CPU-limited when the cores hit around 85%; for some reason it cannot push the CPU cores up to 100%. So check that. They also disabled hyper-threading as a step to remove stutter, so fewer cores are under heavier load. Digital Foundry did some investigation into this if you want to dig into the CPU limitation and the can't-reach-100% behavior.

Okay, this game runs like absolute shit on both a 3070@1440p and on an RX6700@1080p

The stutters are mostly fixed, but that still means stuttering. Given how it looks and runs on PS5, this is inexcusable - it's solid proof that the PC release received insufficient effort and was phoned in. I didn't even pay for this (AMD gave me a free copy) and I'm not sure it's worth my time.

It's a shame, because I loved Dead Space, but this is a poor facsimile of DS with a rocky launch, mediocre gameplay, numerous technical problems, pitiful optimisation (it's not sure if it's been optimised in any way at all) and seemingly minimum effort from the developer for PC gamers. It makes for some pretty screenshots and that's about it, IMO.
Check my other reply in this thread. My experience and ultimate visual and performance settings IMO.

Is a good game? Do you recommend it?
People say good things and bad things about the game.

I found it quite fun, exciting and engaging. Very linear, yes, but great entertainment in an AAA package; a cinematic experience with my settings on my 42” OLED. Yes, there were performance issues on day 1, but compared to now, with patches 2-3, it's a different game. For me ray tracing is a marketing hoax, so what I found to be the absolute best quality vs performance is 4K, HDR, everything on high, no RT, no AA, no motion blur, no film grain, no depth of field. Most important of all, though, is forcing sharpening at 75-100% in the Nvidia driver; it makes a BIG difference! I HATE blurry games/textures and this REALLY helps, especially when AA is disabled. If the performance demands are too high, you can set shadows and volumetrics to medium with no problem. I also used a frame limiter at 80 fps to reduce the stress and power draw on my 4090, as this type of game does not in any way need 120 Hz. The price was too high, though. Anyway, I think it was a great game and experience, but without a doubt I'm looking forward to the Dead Space remake in January 2023. I also always set the NV Control Panel texture quality to High Quality vs the default Quality; it makes things a tiny bit sharper overall at virtually no performance cost.
 
For me Raytracing is a marketing hoax so what I found to be absolutely best quality vs performance is 4K, HDR, Everything high, no RT, no AA, no motion blur, no film grain, no depth of field….but most important sort of is forcing sharpen on 75-100% in Nvidia driver …. Makes a BIG difference ! I HATE blurry games/textures and this REALLY makes a difference, especially when AA is disabled…. If performance demands are too high you can with no problem set shadows and volumetric to medium. I also used a frame limiter to 80 fps, to reduce stress and power from my 4090
Sounds like I just need to buy a 4090. /s

I'm running a 3070 and 5800X, ray tracing off: 40-ish FPS on ultra, 60-ish FPS on high. I need custom settings to pick and choose texture and lighting quality without murdering the framerate, because "high" looks like a PS3 game, but ultra really brings the stutters, the loading lag, and wild framerate swings from 165 to 30.

My monitor's FreeSync window starts at 48 Hz, so I lock the framerate to 50 fps in game to try and get the most consistent, stutter-free experience, and even with FSR enabled it's still dropping frames all over the place. I know my rig isn't exactly high-end by current standards, but this is simply a terrible, rushed port of a console game with all kinds of issues and near-zero optimisation. The console versions prove that it's a shoddy port. On top of that, HDR seems to be awful on my displays with this game, and the gamma/black levels were wrong by default, something that's SUPER important for a dark and moody horror game.

The game came free with an AMD RX 6700 10 GB, and that's a good 25% slower than the 3070, but at least it can put up a 1080p60 experience when it's not stuttering.
 