
Shadow Of The Tomb Raider - CPU Performance and general game benchmark discussions

System Name: Da Bisst
Processor: Ryzen 5800X
Motherboard: Gigabyte B550 AORUS PRO
Cooling: 2x 280 mm + 1x 120 mm radiators, 4x Arctic P14 PWM, 2x Noctua P12, TechN CPU block, Alphacool Eisblock Aurora GPU block
Memory: Corsair Vengeance RGB 32 GB DDR4 3800 MHz C16, tuned
Video Card(s): AMD PowerColor 6800XT
Storage: Samsung 970 Evo Plus 512 GB
Display(s): BenQ EX3501R
Case: SilentiumPC Signum SG7V EVO TG ARGB
Audio Device(s): Onboard
Power Supply: Chieftec Proton Series 1000W (BDF-1000C)
Mouse: Mionix Castor
Hello World!

Since Shadow of the Tomb Raider is a particularly CPU-intensive game, or better said, one where CPU performance has a major impact on FPS, it would be interesting to see how our setups handle it.

For that, let's settle on very specific in-game settings:

Fullscreen: ON
Exclusive Fullscreen: ON
DirectX 12
DLSS: OFF
VSync: OFF
Resolution: 1920x1080
Anti-Aliasing: OFF
Graphics settings: Lowest preset (please leave it at Lowest without any changes for the purpose of this test)

We shall gather and centralize the scores once a month, CPU and GPU, and try to find a correlation between them.
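For the correlation part, here is a rough sketch in Python of how the monthly numbers could be crunched, assuming we collect each submission's CPU Game and GPU averages (the score pairs below are made up, not real submissions):

# Sketch: Pearson correlation between submitted CPU and GPU averages.
# Requires Python 3.10+ for statistics.correlation; all numbers are placeholders.
from statistics import correlation

cpu_fps = [304, 417, 290, 350]  # hypothetical "CPU Game" averages
gpu_fps = [262, 270, 241, 266]  # hypothetical "GPU" averages
print(f"Pearson r = {correlation(cpu_fps, gpu_fps):.2f}")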

Here is the first one :)

SOTR Low .png


:peace:
 
5600X @ 4.8 GHz, 4000 CL16 tweaked RAM :) I set Exclusive Fullscreen but it keeps resetting :/

sottr 4.8 lowest.jpg
 
Speaking of CPU-centric benchmarks in this game, I noticed that Resizable BAR on my system (5800X and RX 6800XT) at best gives no performance benefit, and on average actually a very slight reduction in FPS: over a run of 5, an average of minus 2 FPS with Resizable BAR on. That could be run-to-run variance, but it is still a minute reduction.
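A quick way to sanity-check whether a ~2 FPS delta over 5 runs is inside the noise; a sketch only, the run results below are invented:

# Sketch: compare a small FPS delta against run-to-run spread.
from statistics import mean, stdev

rebar_off = [262, 259, 264, 261, 260]  # hypothetical 5-run results
rebar_on = [260, 258, 261, 259, 258]
delta = mean(rebar_off) - mean(rebar_on)
spread = max(stdev(rebar_off), stdev(rebar_on))
print(f"delta = {delta:.1f} FPS, run-to-run spread ~ {spread:.1f} FPS")
# A delta within roughly one spread is hard to call a real regression.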

5600X @ 4.8 GHz, 4000 CL16 tweaked RAM :) I set Exclusive Fullscreen but it keeps resetting :/

View attachment 195219
There is sometimes some wonkiness when changing settings in-game, as in some settings needing a full game restart to apply correctly. I noticed it when switching between Lowest and Highest: I would get the same score with Highest as with Lowest, which is only possible if the setting did not really apply :)
 
Here's mine, with 4.5 GHz on all cores. IF and RAM running at 1866 with 16-18-18-38-58 (secondary timings untouched). I'm using the demo; can somebody compare their results with the full version?

SOTTR.jpg
 
Here's mine, with 4.5 GHz on all cores. IF and RAM running at 1866 with 16-18-18-38-58 (secondary timings untouched). I'm using the demo; can somebody compare their results with the full version?

View attachment 196597
Those are actually quite good CPU scores for a 3300X, also keeping in mind this game loves moar cores. I also wonder if there is anyone else with a 3300X who would want to post their results here; we are bombarded too much by last-gen CPUs and too little by "older" ones, and this type of test puts generations into perspective :)
 
too little by "older" ones, and this type of test puts generations into perspective
I've got a couple of older PCs, but they're all running Windows 7 or XP, so obviously no DX12. DX11 would completely skew the results, so here's hoping :)
 
Deleted my old post because I forgot to turn off AA
4.525 GHz all-core, 3666 RAM at 16-16-19-16-36 plus a few secondary timings, +150ish on the GPU core
Screenshot (5).png
 
Testing some new RAM (and the latest GPU drivers)

5800X, 95W Eco Mode, 32 GB (4x8) 3200 14-14-14-34

View attachment 197925
Interesting, I have been toying with the idea of lowering the RAM frequency (from 3733 currently) while also tightening the timings as much as possible; this game really loves RAM with fast timings and low latency :)
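The quick math behind that: first-word latency in nanoseconds is roughly CL * 2000 / transfer rate, so a lower frequency with tight enough timings can match or beat a faster, looser kit. A small sketch, with kits picked purely for illustration:

# Sketch: first-word latency (ns) = CL * 2000 / transfer rate (MT/s)
def cas_ns(mts, cl):
    return cl * 2000 / mts

for mts, cl in [(3200, 14), (3600, 14), (3733, 16)]:
    print(f"{mts} CL{cl}: {cas_ns(mts, cl):.2f} ns")
# 3200 CL14 -> 8.75 ns, 3600 CL14 -> 7.78 ns, 3733 CL16 -> 8.57 ns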
 
Interesting, I have been toying with the idea of lowering the RAM frequency (from 3733 currently) while also tightening the timings as much as possible; this game really loves RAM with fast timings and low latency :)

I've been trying to decide which kit I wanted to run, but I think this test made the decision. These are standard XMP timings with GDM disabled. I'd like to see what I could get the timings to at 3600.

Also, I just put the CPU back to stock (PBO disabled and standard 105W).

SotTR-stock.jpg
 
I've been trying to decide which kit I wanted to run, but I think this test made the decision. These are standard XMP timings with GDM disabled. I'd like to see what I could get the timings to at 3600.

Also, I just put the CPU back to stock (PBO disabled and standard 105W).

View attachment 197926
I seem to get better CPU results in SOTR with an all-core 4.7 GHz on the 5800X vs PBO boosting to 5 GHz for some reason, but it's very close. One thing that I am not really clear about: I get pretty crappy CPU Render scores for the settings I run, and that bugs me; I have to find out why :D

304 vs your 417, and that with 3733 dual-rank CL16 tuned timings...
 
I seem to get better CPU results in SOTR with an all-core 4.7 GHz on the 5800X vs PBO boosting to 5 GHz for some reason, but it's very close. One thing that I am not really clear about: I get pretty crappy CPU Render scores for the settings I run, and that bugs me; I have to find out why :D

304 vs your 417, and that with 3733 dual-rank CL16 tuned timings...

Yeah, that doesn't make any sense. That's a really large difference for such similar hardware. Has me curious too. :wtf:
 
This is a weird game engine, so I played a bit with RAM timings, especially tRFC, which I think was a bit too tight; I loosened it and tested again:

comp.png


Initial result on the left, today's result on the right. What is really weird: I got a 40 FPS increase in Average CPU Render, but a 40 FPS decrease in GPU score, with the final result being the same overall FPS :roll:
Granted, it's on different drivers, and there seems to be a slight performance benefit with the previous AMD driver, 21.3.1, but still, this is odd :laugh:
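One crude way to think about results like this: if the final FPS is pinned by whichever side is slower per frame, big swings on the faster side never show up in the overall number; it only moves when the bottleneck itself changes. A toy model only, not how the benchmark actually computes its scores:

# Toy bottleneck model (NOT the benchmark's actual math):
# overall frame time is gated by the slower of the CPU and GPU stages.
def overall_fps(cpu_render_fps, gpu_fps):
    return 1000 / max(1000 / cpu_render_fps, 1000 / gpu_fps)

print(overall_fps(300, 260))  # hypothetical GPU-bound run -> 260
print(overall_fps(340, 260))  # +40 CPU FPS, still GPU-bound -> 260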
 
This is a weird game engine, so I played a bit with RAM timings, especially tRFC, which I think was a bit too tight; I loosened it and tested again:

View attachment 197931

Initial result on the left, today's result on the right. What is really weird: I got a 40 FPS increase in Average CPU Render, but a 40 FPS decrease in GPU score, with the final result being the same overall FPS :roll:
Granted, it's on different drivers, and there seems to be a slight performance benefit with the previous AMD driver, 21.3.1, but still, this is odd :laugh:

Yeah, the 21.3.1 AMD drivers are insanely good for an FPS boost... I'm considering never upgrading from them. AMD seems to have nailed something right in them, and I doubt they even know what they did.
 
Yeah, the 21.3.1 AMD drivers are insanely good for an FPS boost... I'm considering never upgrading from them. AMD seems to have nailed something right in them, and I doubt they even know what they did.
Me too; even though I like the Vivid Gaming thing in the latest driver, 21.3.1 seems better for sustained performance for some reason. Really good driver.
 
Me too; even though I like the Vivid Gaming thing in the latest driver, 21.3.1 seems better for sustained performance for some reason. Really good driver.

The Vivid thing is terrible. Look at TechPowerUp after you turn on Vivid: the gray turns pink... that's not right...

Just calibrate your monitor better to begin with and you don't need Vivid.
 
The Vivid thing is terrible. Look at TechPowerUp after you turn on Vivid: the gray turns pink... that's not right...

Just calibrate your monitor better to begin with and you don't need Vivid.
It might be wrong for others, but on mine it really does look better; it does not really change colors, just makes things a bit more... "vivid" :D And I am speaking only about how games look, not Windows in general.

I am trying some more RAM timings to see how the CPU stats in SOTR change; it's interesting in a way, though sort of unpredictable.
 
CIiuWxu.jpg


This is a weird game engine, so I played a bit with RAM timings, especially tRFC, which I think was a bit too tight; I loosened it and tested again:

View attachment 197931

Initial result on the left, today's result on the right. What is really weird: I got a 40 FPS increase in Average CPU Render, but a 40 FPS decrease in GPU score, with the final result being the same overall FPS :roll:
Granted, it's on different drivers, and there seems to be a slight performance benefit with the previous AMD driver, 21.3.1, but still, this is odd :laugh:

How are you getting a score so much higher than mine?
 
Since Shadow of the Tomb Raider is a particularly CPU-intensive game, or better said, one where CPU performance has a major impact on FPS, it would be interesting to see how our setups handle it.
How's the difference between the benchmark and actual gameplay?
 
CIiuWxu.jpg

How are you getting a score so much higher than mine?
A couple of reasons. First, the 6800XT I have vs your 6800: my 6800XT is running at 2.7 GHz with above-stock power limits, basically 400W, water cooled. Second, and I know that for sure since I had the 5600X before the 5800X, I run a 5800X, and this game loooooooooooooves extra cores :)

I was speaking to a mate who has the same setup as me, except with a 5900X; he gets an extra 10-15 FPS at 1080p, and the 5950X gets another 10 on top of that. It's weird that every 2 extra cores give extra FPS.
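That taper per extra pair of cores looks a lot like Amdahl's law. A rough sketch, assuming (purely as an illustration, not a measurement) that about 75% of the CPU-side frame work scales with core count:

# Amdahl's law sketch: speedup(n) = 1 / ((1 - p) + p / n)
# p = parallel fraction of CPU frame work; 0.75 is an assumption, not measured.
def speedup(p, cores):
    return 1 / ((1 - p) + p / cores)

base_fps = 260  # hypothetical 6-core baseline
for cores in (6, 8, 12, 16):
    fps = base_fps * speedup(0.75, cores) / speedup(0.75, 6)
    print(f"{cores} cores: ~{fps:.0f} FPS")
# Gains per extra pair of cores shrink (~24, ~14, ~8 FPS), roughly in line
# with the steps seen from 5800X to 5900X to 5950X.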

How's the difference between the benchmark and actual gameplay?
If you mean in regards to FPS only, I would not know, since I play it at 3440x1440; I only test it at 1080p in this thread to see the CPU impact on FPS.
I can speak to the difference at 3440x1440: the benchmark gives a bit higher numbers there than in normal gameplay, though it fluctuates depending on the area, so I would assume the same is true for 1080p.
 