
The Callisto Protocol Benchmark Test & Performance Analysis

W1zzard

The Callisto Protocol brings impressive graphics to the horror-survival genre. In our performance review, we're taking a closer look at image quality, differences between the graphical setting presets, VRAM usage, and performance on a selection of modern graphics cards.

 
There must be something wrong with UE4. I remember playing Mass Effect: Andromeda on a 2070. In most scenes, the game capped out at 120 FPS, but in some, I barely passed 30... at 1080p.
 
There must be something wrong with UE4. I remember playing Mass Effect: Andromeda on a 2070. In most scenes, the game capped out at 120 FPS, but in some, I barely passed 30... at 1080p.
UE4 has always been a janky mess. From stuttery shader compilation, to borked defaults for ultrawide and left-handed keyboard & mouse players, to the ridiculous defaults for that absurd Eye Adaptation / Auto Exposure effect (the first thing I disable, so often that I know r.EyeAdaptationQuality=0 off by heart), I've always found it massively overhyped as some "evolution" of UE1-3.
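If anyone wants that tweak to stick without retyping it every session, here's a minimal sketch of the usual UE4 route, assuming the game reads the standard per-user Engine.ini and doesn't lock its config files (I haven't verified either for this title):

    ; Engine.ini, for UE4 games usually found under
    ; %LOCALAPPDATA%\<GameFolder>\Saved\Config\WindowsNoEditor\
    [SystemSettings]
    r.EyeAdaptationQuality=0    ; turns the auto exposure / eye adaptation effect off

Any other cvar you'd normally punch into the console can go in the same [SystemSettings] block.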

As for The Callisto Protocol, I'm more interested in whether it's actually fun. There seem to be an awful lot of reviews (currently way down at Mixed) saying that even ignoring all the performance problems, the gameplay is a "pale shadow of Dead Space", "soulless", has "extremely janky combat", etc. Can't say I'm in any rush to blindly throw £50 at that even if it ran at 9999 fps.
 
Hello,

Thanks for the benchmark & perf analysis!

Just asking: are you going to add an RT performance analysis? If not, what's the reason? (for this game or others)
 
Hello,

Thanks for the benchmark & perf analysis!

Just asking: are you going to add an RT performance analysis? If not, what's the reason? (for this game or others)

It's probably already a ton of work just doing the rasterized performance, and it really isn't apples to apples because AMD performance would be dismal. The game's also pretty unoptimized on PC, so it was probably difficult to get accurate data as it is.
 
The day-2 update solved most of the stuttering issues for me.

But the biggest problem with this game is the game itself, not the graphics.
 
There must be something wrong with UE4. I remember playing Mass Effect: Andromeda on a 2070. In most scenes, the game capped out at 120 FPS, but in some, I barely passed 30... at 1080p.
What?

Mass Effect: Andromeda runs on EA's Frostbite engine, not UE4.
 
So, given the graphics quality, do you think the game's hardware requirements are reasonable?
 
Why again a 5800X? And no 6950 XT? But when it's Nvidia, the 3080 Ti/3090/3090 Ti all make the cut. Seriously, a big card is missing. The XTXH-die 6900 XT was already a good upgrade over the regular 6900 XT; fine, you don't test the 6900 XT LC, OK, but no 6950 XT? C'mon, it looks like AMD gets the cheap treatment.

But why ask... when we see that TPU gave an Editor's Choice award to an overpriced Asus RTX 4080 STRIX OC :rolleyes:
 
The lowest quality sort of looks better than the ultra omega quality to me. It seems like there is some AA blurring the picture in the ultra mode.
 
Why again a 5800X? And no 6950 XT? But when it's Nvidia, the 3080 Ti/3090/3090 Ti all make the cut. Seriously, a big card is missing. The XTXH-die 6900 XT was already a good upgrade over the regular 6900 XT; fine, you don't test the 6900 XT LC, OK, but no 6950 XT? C'mon, it looks like AMD gets the cheap treatment.

But why ask... when we see that TPU gave an Editor's Choice award to an overpriced Asus RTX 4080 STRIX OC :rolleyes:
Because the 5800X kind of represents an average of what CPUs people have. That is, there's no point testing with a 13900K when maybe 5 people have one.

The 6950 XT is a 6900 XT with a slight overclock. No need for a separate test, imo. At least I can easily guess the expected performance of my 6750 XT from the 6700 XT data.

An award for any 4080 is a bit steep, I'll give you that, but I'm quite happy with the data set in this analysis.
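The gap is also small enough to estimate by hand. A rough first-order sketch in Python, using AMD's reference clocks from memory, so treat the exact figures as assumptions:

    # First-order estimate: scale a measured 6900 XT result by the game-clock ratio.
    # Assumed reference specs: 6900 XT ~2015 MHz game clock / 16 Gbps memory,
    # 6950 XT ~2100 MHz game clock / 18 Gbps memory.
    fps_6900xt = 100.0                       # whatever a given chart shows for the 6900 XT
    clock_ratio = 2100 / 2015                # ~1.04
    est_6950xt = fps_6900xt * clock_ratio    # ~104 fps before any memory-bandwidth gain
    print(f"estimated 6950 XT: {est_6950xt:.1f} fps")

Real results land around that, give or take a little from the faster memory, so a dedicated 6950 XT run wouldn't move any conclusions.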
 
The game is already heading toward irrelevancy after just a few days. Ouch.
 
That 5800X is holding the 4080/4090 back, especially below 4K, and probably a bit at 4K too if other games are anything to go by. Tests should be done with a 5800X3D, an AMD 7000-series, or an Intel 13000-series CPU, or don't bother.
 
@W1zzard
The game looks really amazing!
Two small errors on the conclusion page:
- even though the game use uses the previous generation Unreal Engine 4
- Due to the way Unreal Engine is used, the first time you see a graphics effect the first time, your gameplay will hang
 
That 5800X is holding the 4080/4090 back, especially below 4K, and probably a bit at 4K too if other games are anything to go by. Tests should be done with a 5800X3D, an AMD 7000-series, or an Intel 13000-series CPU, or don't bother.
5800X3D maybe... but honestly, how many people have an AMD 7000 series or Intel 13000 series CPU? Let's not forget that this review is supposed to provide everyday people with everyday performance data that they can expect on their own machines, and not a best-case scenario.
 
Why again a 5800X? And no 6950 XT? But when it's Nvidia, the 3080 Ti/3090/3090 Ti all make the cut. Seriously, a big card is missing. The XTXH-die 6900 XT was already a good upgrade over the regular 6900 XT; fine, you don't test the 6900 XT LC, OK, but no 6950 XT? C'mon, it looks like AMD gets the cheap treatment.

But why ask... when we see that TPU gave an Editor's Choice award to an overpriced Asus RTX 4080 STRIX OC :rolleyes:
Haven't had time to upgrade the GPU test system yet .. it takes 3 weeks of non-stop testing to retest all cards and all games. Right now I'm working on reviews of several upcoming GPUs and CPUs

AMD sent no RX 6950 XT reference card. I asked and explained that otherwise it won't be part of my comparisons, but they still didn't bother .. "we don't have any cards". Can you send me a 6950 XT reference card? I could give you another card in return.

Intel Arc results?
Just asking: are you going to add an RT performance analysis? If not, what's the reason? (for this game or others)
Just not enough time, I barely managed to get this posted, and Portal
 
I'm surprised the game made it through QA with that much stuttering.

It made it through because they just ignored it with a shrug of the shoulders and a 'Well, that's PC gaming for you' attitude.
 
5800X3D maybe... but honestly, how many people have an AMD 7000 series or Intel 13000 series CPU? Let's not forget that this review is supposed to provide everyday people with everyday performance data that they can expect on their own machines, and not a best-case scenario.
And how many have 4080s and 4090s? The whole point is using a fast enough CPU so it doesn't bottleneck the GPU; otherwise the whole test is pointless.

Why not test with an Intel 2700K or an AMD FX CPU? There are probably plenty of people out there who still have those, but then the graphs would be close to identical across the board and again completely wrong, because the GPUs would be held way back.
 
I don't know how you (W1zzard) managed to benchmark this game. It stutters and results vary from one run to another: you have to do at least one run, and the second is already better. The GPU also sometimes isn't fully utilized; it sits at 60-70% while the framerate is 30 fps for me.
I think after some patches this game could look very different.
AMD-sponsored title and Nvidia does better :laugh: some people will get fired for this.
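For anyone benchmarking it themselves, the usual workaround for that variance is to throw the first pass away. A minimal sketch of the idea (run_benchmark() is a hypothetical stand-in for whatever pass you time, not how TPU actually does it):

    import random
    import statistics

    def run_benchmark() -> float:
        # Hypothetical placeholder: in reality this would play through the test
        # scene and return its average fps; here it just returns a noisy value.
        return random.uniform(55.0, 70.0)

    # Discard the first pass so shader compilation and asset streaming
    # don't pollute the numbers, then average the remaining passes.
    runs = [run_benchmark() for _ in range(4)]
    timed = runs[1:]
    print(f"avg: {statistics.mean(timed):.1f} fps, "
          f"run-to-run spread: {max(timed) - min(timed):.1f} fps")

The spread between the kept passes is a decent sanity check that the numbers have actually settled.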
 
Haven't had time to upgrade the GPU test system yet .. it takes 3 weeks of non-stop testing to retest all cards and all games. Right now I'm working on reviews of several upcoming GPUs and CPUs

AMD sent no RX 6950 XT reference card. I asked and explained that otherwise it won't be part of my comparisons, but they still didn't bother .. "we don't have any cards". Can you send me a 6950 XT reference card? I could give you another card in return.



Just not enough time, I barely managed to get this posted, and Portal
It's okay, no worries. Thanks for your work!

I guess it will be more interesting to benchmark RT when the AMD Radeon 7900 XT(X) releases?
 
What is it with the test setup? Using a 5800X at the end of 2022 is questionable enough, but the RAM… 4000 MHz at 20-23-23-42? Those timings are atrocious, even considering the 1T command rate.
 
What is it with the test setup? Using a 5800X at the end of 2022 is questionable enough, but the RAM… 4000 MHz at 20-23-23-42? Those timings are atrocious, even considering the 1T command rate.
13900K soon, just no time to set up the system

What CPU do you have? The 5800X is still faster than what the vast majority of gamers use
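As for the timings, a quick back-of-the-envelope check of first-word latency (DDR4-3600 CL16 used purely as an arbitrary reference point):

    # First-word CAS latency in ns: the memory clock is half the data rate,
    # so one cycle is 2000 / data_rate_mts nanoseconds, times CL cycles.
    def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
        return cl * 2000 / data_rate_mts

    print(cas_latency_ns(20, 4000))   # test system: DDR4-4000 CL20 -> 10.0 ns
    print(cas_latency_ns(16, 3600))   # typical tuned kit: DDR4-3600 CL16 -> ~8.9 ns

That's roughly a nanosecond looser than a well-tuned kit, not great, but nowhere near enough to reorder the GPU charts.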

 
Thanks @W1zzard for your efforts. Sadly, I'm giving this one a pass, mainly due to the terrible performance issues on PC. I'm not willing to pay full price for something that wasn't properly tested.
 