Tuesday, January 30th 2007

First benchmarks of Crysis appear

The first major DirectX 10 title from EA, Crysis, is a game I'm sure a lot of us are looking forward to. The folks over at Hardspell managed to get a pre-release copy of Crysis running on Windows Vista, using drivers that aren't available to the public yet. These preliminary benchmarks certainly don't look too good: a system that we'd call overkill high-end simply choked on Crysis. The test rig was running a QX6700, two G80s, dual Raptors in RAID 0, 4 GB of low-latency RAM, and a PhysX card.
  • At 1024x768, Crysis ran at 44 FPS average, with a minimum of 33 FPS and a maximum of 69 FPS.
  • At 1280x1024, Crysis ran at 37 FPS average, with a minimum of 17.3 FPS, and a maximum of 52 FPS.
  • At 2560x1600, Crysis ran at a whopping 1.7 FPS average, with a minimum of 0.2 FPS and a maximum of around 5 FPS.
Again, neither Crysis nor the official NVIDIA driver has been released, and games with the PhysX card enabled have noticeably lower FPS than their non-PhysX counterparts.

Source: VR-Zone and Hardspell
Add your own comment

42 Comments on First benchmarks of Crysis appear

#1
Sasqui
It's going to take a while before the game developers even know what to do with DX10... let alone get into optimization. Another reason to stay with XP in the meantime.
Posted on Reply
#2
EastCoasthandle
That's a blow against Vista, until shown otherwise. I wonder what the results would be if it were run on XP?
Posted on Reply
#3
jydie
Wow, if the requirements are that high for the final release, then they will not be selling very many copies. I probably could not even play Crysis at 640x480 with my Radeon X850XT. :( I like my games to look good, but the developers need to cater to those of us who cannot afford $500 video cards. The best-selling games are those that can scale down and run on systems that are 2-3 years old. One reason World of Warcraft has done so well is that the video settings can be adjusted so that it is playable on a high percentage of current systems.

Oh well, maybe I will be able to play Crysis within a couple of years... by then the current $500 video cards should be in the $100-$150 range.
Posted on Reply
#4
pt
not a suicide-bomber
i won't be playing crysis for a long time :p
i will go with s.t.a.l.k.e.r. :)
Posted on Reply
#5
Darksaber
W1zzard's Sidekick
S.t.a.l.k.e.r Ftw!!!
Posted on Reply
#6
overcast
by: jydie
Wow, if the requirements are that high for the final release, then they will not be selling very many copies. I probably could not even play Crysis at 640x480 with my Radeon X850XT. :( I like my games to look good, but the developers need to cater to those of us who cannot afford $500 video cards. The best-selling games are those that can scale down and run on systems that are 2-3 years old. One reason World of Warcraft has done so well is that the video settings can be adjusted so that it is playable on a high percentage of current systems.

Oh well, maybe I will be able to play Crysis within a couple of years... by then the current $500 video cards should be in the $100-$150 range.
Crysis is not going to "look good" in XP on an X850XT either.
Posted on Reply
#7
pt
not a suicide-bomber
by: overcast
Crysis is not going to "look good" in XP on an X850XT either.
nor on my x1800gto, so say f^ck it
they ain't going to sell much if the game is this demanding
Posted on Reply
#9
pt
not a suicide-bomber
by: Batou1986
i doubt this is real
i hope it is for laughing purposes :laugh:
i want to shove this in the faces of the ppl that sold their x1900xt/xtx/x1950, 7900gtx, etc. to buy this one, and prove how retarded they were to buy things without knowing how they perform. they said it would run dx10, just not how :laugh: :laugh: :laugh: :laugh:
Posted on Reply
#10
overcast
I'd take these benchmarks with a grain of salt - they're based on a prerelease copy of the game, prerelease drivers, and an OS that was just released today. Especially with a garbage PhysX card in the mix.

Also, they aren't going to build advanced game engines for 4-year-old hardware. If you want to run new games at pretty resolutions, then you're going to have to upgrade.
Posted on Reply
#11
pt
not a suicide-bomber
by: overcast
I'd take these benchmarks with a grain of salt - they're based on a prerelease copy of the game, prerelease drivers, and an OS that was just released today. Especially with a garbage PhysX card in the mix.

Also, they aren't going to build advanced game engines for 4-year-old hardware. If you want to run new games at pretty resolutions, then you're going to have to upgrade.
the x800 isn't 4 years old, nor is the x1800gto ;)
Posted on Reply
#13
DanTheBanjoman
SeƱor Moderator
It's still a beta using beta drivers. I'd expect it to run a lot better once optimized.
Posted on Reply
#14
overcast
by: pt
the x800 aren't 4 year old, nor the x1800gto ;)
Ok so 4 generations at least :)
Posted on Reply
#15
Exceededgoku
FAKE! And I can't believe you guys believed it.... This game has been said to run fine at full settings with a constant 60 FPS (in DirectX 9 mode) on Vista with an AMD 4800+ X2 and an X1900 GPU.
And the engine doesn't even support PhysX, so why the hell are they talking about that!? No screenshots... yeah, right. I'll eat my TJ07 if this is proved to be true.
Posted on Reply
#16
EastCoasthandle
by: Exceededgoku
FAKE! And I can't believe you guys believed it.... This game has been said to run fine at full settings with a constant 60 FPS (in DirectX 9 mode) on Vista with an AMD 4800+ X2 and an X1900 GPU.
And the engine doesn't even support PhysX, so why the hell are they talking about that!? No screenshots... yeah, right. I'll eat my TJ07 if this is proved to be true.
But you just said DX9. That's not the same as DX10 with the extra eye candy.
Posted on Reply
#17
djbbenn
Hopefully this is fake or there are some bugs, because if not, it'll simply be the beginning of the demise of gaming on PCs. It will just get too expensive to run the games, and people will turn to consoles, which are so hassle-free and far cheaper. It's ridiculous that a system with such specs couldn't even run the game at respectable frame rates at an average resolution.

-Dan
Posted on Reply
#18
overcast
by: EastCoasthandle
But you just said DX9. That's not the same as DX10 with the extra eye candy.
Considering they haven't had DX10 hardware or an OS to run it on during the entire development of the thing, I'm going to wager it adds about as much extra eye candy as the new shader model did in FarCry. Meaning you aren't going to miss much. I mean honestly, have you seen the demo gameplay vids? Those are DX9 and AMAZING.

We won't see a true DX10 game for a couple of years.
Posted on Reply
#19
overcast
by: djbbenn
Hopefully this is fake or there are some bugs, because if not, it'll simply be the beginning of the demise of gaming on PCs. It will just get too expensive to run the games, and people will turn to consoles, which are so hassle-free and far cheaper. It's ridiculous that a system with such specs couldn't even run the game at respectable frame rates at an average resolution.

-Dan
Except that console games are $20 more. Online fees for multiplayer, memory cards, peripherals, etc. Things even out more than you think, considering you're locked into that hardware till the next one comes out.
Posted on Reply
#20
Jimmy 2004
Strange that there are no screenshots :wtf:

I'd like to see what it looked like at those settings as well as figures.
Posted on Reply
#21
MTL
Complete BS. First and foremost, there are no DX10 SLI drivers out there, so you can add as many 8800GTX cards as you want and you won't get any better performance. Next (and it has already been stated), Crysis does not support the PhysX card at all.

How can people believe this drivel?
Posted on Reply
#22
EastCoasthandle
by: overcast
Considering they haven't had DX10 hardware or an OS to run it on during the entire development of the thing, I'm going to wager it adds about as much extra eye candy as the new shader model did in FarCry. Meaning you aren't going to miss much. I mean honestly, have you seen the demo gameplay vids? Those are DX9 and AMAZING.

We won't see a true DX10 game for a couple of years.
That makes no sense. Here is a link, and also here, that states otherwise (DX10 being used).
Posted on Reply
#23
djbbenn
by: overcast
Except that games are $20 more. Online fees for multiplayer, memory cards, peripherals etc etc. Things even out more than you think, considering you are locked into that hardware till the next one comes out.
I'm not defending consoles - I don't even have one, for that matter - but my video card alone cost as much as a 360. So the price of a console is nowhere near that of a computer built to play games. Also, you needn't worry about specs. To a lot of people, that's what they want: hassle-free gaming. Online gaming, well, that's user-dependent. Price of games: almost the same where I live, perhaps a few bucks more expensive.

If you have to spend $1000+ on your PC every 6 months just to keep up with the games, it's going to turn people off. And with PCs, it doesn't end with upgrading one thing; everything now is so dependent on everything else. You get a new graphics card - oh, you need a more powerful PSU to run it. You decide to upgrade your aging CPU - hmm, new mobo and RAM too.

What I'm saying is, if game producers can't figure out how to make a good-looking game that doesn't choke a high-end system to pieces, they're going to lose sales. Loss of sales = decline in the overall market. Some places where I live have already taken their PC games section down because it wasn't selling, while the consoles were. Guess we'll see what the scores for this game really are when it gets released.

-Dan
Posted on Reply
#24
Exceededgoku
by: EastCoasthandle
But you just said DX9. That's not the same as DX10 with the extra eye candy.
DX10 is DX9 plus optimizations, so performance would be even greater, which allows for more eye candy to be added. DX10 improves on DX9 by a hell of a lot, so I call BS on these benchmarks.
Posted on Reply
#25
i_am_mustang_man
this might be real
@ 16xaa
16xaf
HDR
max textures and everything
but that's to be expected i say
Posted on Reply
Add your own comment