
Crysis Warhead Post-release Hardware Tests Show Neutral Improvements

btarunr

Editor & Senior Moderator
Crytek has worked closely with NVIDIA in developing the Crysis franchise, and the original title was known to be optimised for GeForce hardware. The original game, however, was criticised for its steep hardware requirements, which may have contributed to its lukewarm sales. With Crysis Warhead, Crytek promises to have improved the game engine to work better with today's hardware. PC Games Hardware (PCGH) put the new game to the test, not with the prime objective of reviewing it, but of evaluating how it performs on current hardware. There are positives to be drawn from the findings. The first is that the game performs to the potential of the installed hardware, be it GeForce or Radeon, with only minor deviations from what synthetic tests suggest the cards are capable of. For example, the Radeon HD 4870 performed neck and neck with the GeForce GTX 260 in the "Gamer" mode, with the former achieving a higher minimum frame-rate. The same was seen in the game's "Enthusiast" mode, albeit with the GeForce edging ahead on average frame-rate. The trend continued with the rest of today's GPUs, which is indeed a positive sign.
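For readers unfamiliar with how the minimum and average frame-rates in such charts are derived, here is a minimal sketch in Python, assuming a hypothetical per-frame timing log rather than PCGH's actual tooling:

```python
# Minimal sketch: deriving minimum and average FPS from a per-frame timing
# log. The frame times below are hypothetical, not PCGH's measurements.

def fps_stats(frame_times_ms):
    """Return (min_fps, avg_fps) for a list of per-frame render times in ms."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    min_fps = 1000.0 / max(frame_times_ms)                        # slowest single frame
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # frames / elapsed time
    return min_fps, avg_fps

# Hypothetical five-frame log (milliseconds per frame).
sample = [28.0, 31.5, 45.2, 30.1, 29.7]
mn, avg = fps_stats(sample)
print(f"min: {mn:.1f} fps, avg: {avg:.1f} fps")
```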



On the CPU side, the game's performance was in many ways proportional to the processor's capability. However, quad-core and dual-core processors exchanged blows to produce an interesting mix of scores. The Core 2 Extreme QX6850 traded blows with the Core 2 Duo E8400, which shares its 3.00 GHz clock speed, with the QX6850 providing only a nominal improvement over the E8400. The exact opposite happened with the Core 2 Quad Q6700 and Core 2 Duo E6750, with the dual-core eking out a 1 fps lead. The CPU scores show that the game is still largely comfortable with today's dual-core processors, with quad-core ones offering no real advantage. As for system memory, it was seen that in a 64-bit Windows Vista environment, having 4 GB did step up performance, and the increase was far from nominal.

View at TechPowerUp Main Site
 
I saw these charts before but they were in German.

Shows promise for Warhead's scaling with GPU power. I hope we can see some more charts like this featuring SLI and CrossFire setups. I doubt we'll have to wait very long... Warhead will soon replace the original Crysis on the hotlist for hardware reviews.
 
Good that it is finally running better on ATI GPUs.

Or maybe it is still not running as well as it could on ATI GPUs, but the raw power of ATI GPUs lets them perform as well as NVIDIA GPUs.
 
Hell, even if it just runs almost as well on ATI's cards, that would be a vast improvement over the heavy bias present in regular Crysis.

I'm surprised that the X2 6000+ chip was able to maintain something like 50% higher minimum framerates than equivalent Core 2 chips, even though it's a bit slower with the max -- is it always like that?
 
I'm surprised that the X2 6000+ chip was able to maintain something like 50% higher minimum framerates than equivalent Core 2 chips, even though it's a bit slower with the max -- is it always like that?

Those are minimum and avg, no max value.
 
Once I got Crysis Wars running, I found the performance to be only slightly better than the original Crysis. I hope to find some custom configs for an 8800 GT though.
 
I'll call BS on some of these charts.

Compare GPU2 Bench and OS Bench

Both use the same in-game settings: Very High, 1680x1050, 16:1 AF. Compare the GTX 280 Vista numbers.
In the GPU2 chart, the GTX 280 on Vista x64 2GB shows min 8 / avg 22.
In the OS chart, both Vista x64 2GB modes (DX9/DX10) show min 0 / avg 20-21,
and the only system showing min 8 / avg 22 is XP 2GB.

My XP 2GB E6750 / 8800 GT at 1680x1050, no AA, Gamer is a solid 30 average.

Just saying
 
I can't believe the Athlon X2 6000 provides better frame rates than the Phenom 9850.... WOW. I mean maybe I'll just pop in my 6400+ AGAIN...
 
This test looks kinda weird IMO. Look at the minimum FPS: in the first test the GTX 280 gets the same min FPS as the 1900 XTX (!!!) and in the second the same as the 3870.
 
Yeah, these charts are pants. Compare the CPU/GPU between charts 1 and 3.
He goes from Vista64 2GB Gamer 1680x1050 to XP Very High 1024x768, which screws up a direct comparison, but still:
QX9850/9800GTX+ goes from min 16 / avg 28 to min 35 / avg 40.

Well, I must say congrats on doing all the tests though. Too bad they didn't sync up more of the details between the runs.
 
Best minimum FPS with enthusiast settings for my GPU (9800GTX+), w00t :) That second screen just made my day, beating HD4870, GTX260 and GTX280.
 
I can't believe the Athlon X2 6000 provides better frame rates than the Phenom 9850.... WOW. I mean maybe I'll just pop in my 6400+ AGAIN...

It's all down to the clock speed... if you overclock that AMD quad you'll get more out of it... ;)
These tests were done at stock speeds on these chips.
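To illustrate why clock speed can trump core count here, below is a back-of-the-envelope sketch using a crude Amdahl-style model and hypothetical numbers (not PCGH data), assuming the game only keeps about two threads busy:

```python
# Crude Amdahl-style estimate (hypothetical numbers, not PCGH data): if the
# game only keeps ~2 threads busy, per-core clock matters more than core count.

def est_relative_fps(clock_ghz, cores, threads_used=2, parallel_fraction=0.6):
    """Relative CPU-bound frame rate, in arbitrary units."""
    usable_cores = min(cores, threads_used)
    serial = 1.0 - parallel_fraction
    speedup = 1.0 / (serial + parallel_fraction / usable_cores)
    return clock_ghz * speedup

# Hypothetical comparison: 3.0 GHz dual-core vs 2.5 GHz quad-core.
dual = est_relative_fps(clock_ghz=3.0, cores=2)
quad = est_relative_fps(clock_ghz=2.5, cores=4)
print(f"dual-core estimate: {dual:.2f}, quad-core estimate: {quad:.2f}")
# The quad's extra cores sit idle, so the higher-clocked dual comes out ahead.
```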
 
Why no 4870 X2 on the chart?

Oh, I understand, it must be off the chart! Haha, jk.
 
Those are minimum and avg, no max value.

Ah -- you're right. So basically that means that the Intel scores were all over the place.

I can't believe the Athlon X2 6000 provides better frame rates than the Phenom 9850.... WOW. I mean maybe I'll just pop in my 6400+ AGAIN...

Obviously this game isn't quad-core optimized. ;)
 
WTF is with the E6320 vs E6700/E6750 at min FPS? o0

A 1.87 GHz chip faster than a 2.67 GHz one... this can't be true... the same goes for the XP performance...
 
WTF is with the E6320 vs E6700/E6750 at min FPS? o0

A 1.87 GHz chip faster than a 2.67 GHz one... this can't be true... the same goes for the XP performance...

Yeah, those are kinda fishy. But there is no E6700, it's a Q6700; they've marked it wrong.

Also, why would the 8800 GTS drop so much on min FPS versus the 8800 GT with AF enabled? There must be some random lag going on with Warhead. Some say it runs smooth and some with better rigs say it lags all over the place. Average FPS seems more reliable, as there is no way to tell whether those low FPS readings are just sudden stops, while most of the time the min FPS is much higher.

Still like the min fps for my card though :)
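On the min-versus-average point above: a percentile "low" is usually more robust than the single slowest frame, since one hitch can tank the minimum while the rest of the run stays smooth. A minimal sketch with hypothetical frame times (not taken from these charts):

```python
# Why a lone minimum-FPS value can mislead: one hitch drags it down, while a
# 1% low reflects sustained dips. Frame times are hypothetical, not PCGH data.
import statistics

def fps_summary(frame_times_ms):
    fps = sorted(1000.0 / t for t in frame_times_ms)   # slowest frames first
    k = max(1, len(fps) // 100)                        # worst 1% of frames
    return {
        "min": fps[0],                                 # single slowest frame
        "low_1pct": statistics.mean(fps[:k]),
        "avg": statistics.mean(fps),
    }

# 999 smooth frames at ~33 ms plus a single 100 ms hitch: the hitch tanks the
# minimum, but the 1% low stays close to the average.
log = [33.0] * 999 + [100.0]
print(fps_summary(log))   # min ~10 fps, 1% low ~28 fps, avg ~30 fps
```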
 
Wasn't gonna buy this... but sitting here (drinking a beer) I thought f*ck it... logged into Steam and it's downloading now... if anything it'll be a good bench on my PC... and after all I managed to play the first version on my 24" screen, so I'm sure I'll be fine now as my 4870 gives me good FPS on the old one with everything on High ;)
 
Yeah, those are kinda fishy. But there is no E6700, it's a Q6700; they've marked it wrong.

Also, why would the 8800 GTS drop so much on min FPS versus the 8800 GT with AF enabled? There must be some random lag going on with Warhead. Some say it runs smooth and some with better rigs say it lags all over the place. Average FPS seems more reliable, as there is no way to tell whether those low FPS readings are just sudden stops, while most of the time the min FPS is much higher.

Still like the min fps for my card though :)

Dude, there is an E6700 & a Q6700... there's also a QX6700...
The x6600 parts are clocked at 2.4 GHz... the x6700 parts at 2.66 GHz with a higher multi.
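For reference, those Core 2 clocks fall out of the shared 266 MHz base clock (a 1066 MT/s quad-pumped FSB) multiplied by the chip's multiplier; a quick sketch of the arithmetic:

```python
# Core clock = FSB base clock x multiplier. The Core 2 x6600 and x6700 parts
# share a 266 MHz base (1066 MT/s quad-pumped FSB) and differ only in multiplier.
FSB_BASE_MHZ = 266.67

def core_clock_ghz(multiplier, fsb_mhz=FSB_BASE_MHZ):
    return multiplier * fsb_mhz / 1000.0

for name, multi in [("E6600/Q6600", 9), ("E6700/Q6700/QX6700", 10)]:
    print(f"{name}: {core_clock_ghz(multi):.2f} GHz")
# -> E6600/Q6600: 2.40 GHz, E6700/Q6700/QX6700: 2.67 GHz
```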
 
Dude, there is an E6700 & a Q6700... there's also a QX6700...
The x6600 parts are clocked at 2.4 GHz... the x6700 parts at 2.66 GHz with a higher multi.

There is no E6700 in the TEST ;) I do know it exists :)
 
If you want to compare, then why don't you compare with the original Crysis? TOTAL BS!!!

How can you be sure then that Warhead is optimized? STUPIDS!
 
If you want to compare, then why don't you compare with the original Crysis? TOTAL BS!!!

How can you be sure then that Warhead is optimized? STUPIDS!

BTW, the test looks like an NVIDIA show. I am no fanboy, but where is the 4870 X2?
 
Again, the tests show that the 4800 series is a strong performer when AA is used.
 
The game runs the same with no AA and with 4x AA on Enthusiast DX10. I really love having an HD 4800 card :D
 