
Crysis doesn't push PC to its limit?!!?

Heat = load; lower load, lower heat. If your GPU temp is higher than normal and your CPU temp is lower, it's because they're being used more/less. Duh!

lol, state the bleedin' obvious, I KNOW THAT. I'm not stupid.

One would just assume that such a high-end game would utilise more of both the GPU and the CPU. I know the GPU is working hard, but the CPU definitely plays an important role (obviously), and consequently it seemed logical to assume a greater level of usage on the CPU as well. I do realise that with ever more powerful GPUs, more of the burden is placed on them when playing a game; I just didn't realise how much Crysis lobbed at the GPU versus the CPU compared to other high-end games.
 
After seeing a lot of the demo reviews and reading the forums, it seems that the GPU is the biggest limiting factor right now. But in all that reading, I came across this article --> http://www.shacknews.com/featuredarticle.x?id=639

Here is the main thing that stood out to me:

"Shack: What is the main limiter for Crysis in terms of GPU, CPU, or RAM? If users are near the low end of the requirements, which should they upgrade first?

Cevat Yerli: We would say first CPU, then GPU, then memory. But it must be in balance. If you are balanced, we are more CPU bound then GPU, but at the same time at higher CPU configurations we scale very well for GPUs."


So like many have said, the game is in BETA and by Christmas things should look better.

For me, I ran both benches at 1920x1200, all settings on High, no AA: CPU bench = 25 fps, GPU bench = 27 fps. No overclock on the GPU (RivaTuner isn't playing well with the 169.01 driver); with a GPU overclock I'd expect to see 30-35 fps. I played the demo on Very High first. It looked amazing, but I couldn't hit the broad side of a barn. Later I realized I had AA set to 8x; turning that off made it much better, though still not the smoothest.

The 169.01 driver did make a huge difference in the MP beta, though. 1920x1200 with all settings on High runs smoothly, at around 45 fps.
 
Forget Crysis. I had a bad enough experience with Battlefield 2142; I'd rather not take my chances. This game runs like piss on my rig, whereas the Orange Box runs at max fps at max graphical settings.

If Crytek had written such great CPU code, then it would actually make use of my CPU during the CPU test.
 
Ok. WTF. I'm rendering a 3D image in Bryce 5. It's a free program for crying out loud. 8000x3000 resolution. All 4 cores of my Q6600 are being used to assist my GPU in the rendering. CRYSIS is fucked. It's terribly coded. It doesn't utilize shit. Keep in mind Bryce 5 is FREE! http://www.daz3d.com
 
Well, I hope your assumptions treat you well. I think we can take a lot from this demo, but it's like your system: you didn't know how far it would overclock or how it would perform when you purchased all the components before the build; you had an idea, and maybe a goal or two. That's how I see this demo: it's there, we see it, and we have a good idea of how the game may be upon official release, but it's not the official release; it's a taste. I don't think we can assume that just because the demo doesn't support 4 cores, the official game won't. It's getting to be pretty standard that programs that need performance and a lot of CPU power have some kind of SMP support; it'd be a HUGE mistake if they overlooked that, IMO.

I will wait until release to form my opinions of the game, but that's cool if you feel that strongly about it; sorry it didn't go your way.

Final note: it plays decently on my rig at 1440x900, with no Very High, no DX10, about 50/50 high/medium settings, getting ~26 fps average. Really, at FPS that low I'm impressed with how well the game plays; there are games in this genre that would be annoying at that rate, or where it would be worth losing some visuals just to play. This game pulls off a good feat by being a smooth-playing, well-animated game that doesn't need 60 fps to do a good job. That alone makes it very impressive, even if it doesn't use my whole system. Oh well, it's only a demo atm.

I look forward to its release, as well as a few others. I'm sure it will be a good gaming experience for those who enjoy this type of game and know what Crysis has to offer.

:toast:
 
Ok. WTF. I'm rendering a 3D image in Bryce 5. It's a free program for crying out loud. 8000x3000 resolution. All 4 cores of my Q6600 are being used to assist my GPU in the rendering. CRYSIS is fucked. It's terribly coded. It doesn't utilize shit. Keep in mind Bryce 5 is FREE! http://www.daz3d.com

You really have no idea what you are talking about, plain and simple. The 2900 is not that good of a card in games; that has been proven time and again. It may hold 3DMark06 records, but it will not top a GTX, GT, or Ultra in games consistently. You are limited by your 2900; hell, you could be running a 6300 at stock and you might not even see a performance difference in the demo. If you are hung up on your video card (and you clearly are, yet you do not seem to understand it), it does not matter what your CPU is as long as it can pull its weight.

This demo does NOT use much in terms of CPU power. AI, physics, geometry... that's about all the CPU has to do. When you have a solid dual core, or a quad in your case, that is a cakewalk. The GPU, on the other hand, has to handle rendering polygons, HDR, AA, AF, reflections, volumetric clouds, post-processing effects (auras, metal shine), field of focus (when you aim down the sights), and textures. That is a whole lot of shit for a single unit to do, let alone at high settings. Turn half of the extras off and watch your FPS double. In the meantime, quit whining, get a life, and learn a bit about the games you seem to care so much about.
 
You really have no idea what you are talking about, plain and simple. The 2900 is not that good of a card in games; that has been proven time and again. It may hold 3DMark06 records, but it will not top a GTX, GT, or Ultra in games consistently.

OK, wtf, fanboy??? I never said the 2900 was the best card. I said it's ONE OF THE BEST... and it IS one of the best. It beats my 7950GX2 by quite a bit. The issue here is Crysis, not NVIDIA or ATI. The game runs like shit on both cards. The game doesn't use multiple cores. The game doesn't use SLI. Get a life, bozo.
 
BTW, Kenny, like I said in my previous post, I was using Bryce 5 to render an 8000x3000 image. It uses depth of field (which you tried to explain to me was called "field of focus"), HDR, AA, AF, reflections, volumetric clouds (a fuckload of them), post-processing effects, a FUCKLOAD of polygons, and holy-mother-of-god high-res textures. If all of that is strictly done by the GPU (like you claim), then why in the hell was my CPU at 100% on all four cores?

It's because Bryce was utilizing unused processing power to render the image. My 2900 doesn't suck whatsoever. It may not consistently beat the 8800s, but it's better than the best non-DX10 card (the 7950GX2).

I don't know why you pissed me off so much. You're just a cult fanboy nerd running his mouth on the internet. Crysis is fucked. You know it. Many people will agree. You can try all you want to explain how it's OK for a CPU benchmark not to get even one core past 70%, but I'll still think you're full of piss.
 
You do know that when you render a picture, you don't use the GPU? It's pure CPU power =) In a game they can't achieve decent speeds with CPU power alone, so they use a GPU to speed things up =)
 
You do know that when you render a picture, you don't use the GPU? It's pure CPU power =) In a game they can't achieve decent speeds with CPU power alone, so they use a GPU to speed things up =)

You're being sarcastic.... right? I mean... seriously... you gotta be kidding.... right? No.... seriously.... you don't seriously....

...:wtf:
 
That's how it works. A render server usually consists of a low-end GPU with lots and lots of CPUs in it =) The more CPU power, the faster it will render.
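
Roughly like this, a toy sketch in C++ (not Bryce's actual code, just the idea): the renderer shades every pixel on the CPU and splits the scanlines across however many cores it finds, which is why all four cores of a Q6600 get pegged while the GPU sits idle.

    // Toy CPU-only renderer: every pixel is shaded on the CPU and the
    // scanlines are interleaved across worker threads. Illustrative only.
    #include <cmath>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        const int width = 800, height = 300;   // scaled-down stand-in image
        std::vector<float> image(width * height);

        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;              // e.g. the 4 cores of a Q6600

        std::vector<std::thread> workers;
        for (unsigned t = 0; t < cores; ++t) {
            workers.emplace_back([&, t] {
                // Each thread shades its own interleaved band of scanlines.
                for (int y = t; y < height; y += cores)
                    for (int x = 0; x < width; ++x)
                        image[y * width + x] = std::sin(x * 0.01f) * std::cos(y * 0.01f);
            });
        }
        for (auto& w : workers) w.join();       // all cores pegged until done
        std::printf("rendered %dx%d on %u CPU cores, GPU untouched\n", width, height, cores);
    }

Add more cores and the render finishes proportionally faster; the GPU never enters into it.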
 
I think you have Web Servers and Render Farms confused.

http://www.nvidia.com/page/quadroplex_comparison_chart.html

Check the last one... the S4 Graphics Server. It uses 4 GPUs. These are the kinds of things used in making Toy Story-type movies.

They don't call them graphics processors for nothing.
 
They are mostly used for CAD. I only know of two renderers that can use the GPU: RenderMan and Gelato, and I don't think you're using one of those.

Anyway, I think they intended the game to run at about 25-40 fps, so they don't actually have to use more CPU power and stress the components more.
 
Some things in the demo are not YET supported, like quad SLI. I think the real retail release is gonna shine and use your system as a whole.
 
You really have no idea what you are talking about, plain and simple. The 2900 is not that good of a card in games; that has been proven time and again. It may hold 3DMark06 records, but it will not top a GTX, GT, or Ultra in games consistently. You are limited by your 2900; hell, you could be running a 6300 at stock and you might not even see a performance difference in the demo. If you are hung up on your video card (and you clearly are, yet you do not seem to understand it), it does not matter what your CPU is as long as it can pull its weight.

Whoa, mate, I've proven that the 2900 is good at gaming, lol. Just look at the fps I was getting compared to the GTX and Ultra; it's pretty close to them, and it beats the GTS. The GTX etc. MAY score a few fps more, but stating the 2900 is not that good of a card is idiotic... :roll:

I'm running 1920x1200 with object detail, textures and something else on Very High, the rest on High, and it's playable, even with my shite CPU. So really, Kenny, please don't make yourself sound silly; it's been proven IT IS a good card. Not the best, but once overclocked it rocks.
 
Helvetica, let's just wait for the release, dude. I'd like to see how well it pushes the hardware too.
 
@Helvetica

Please change your tone; I'm not liking your language right now.


As for your argument, you can't compare rendering with a game. And if you really want to do so, check the GPU load in Bryce while doing your render; you'll get the same result, but the other way around.
 
When I start a render, my 2900's fan goes full blast. How can I check GPU load?
 
@Helvetica

Please change your tone; I'm not liking your language right now.


As for your argument, you can't compare rendering with a game. And if you really want to do so, check the GPU load in Bryce while doing your render; you'll get the same result, but the other way around.

I vote ban :rockout:
 
I vote ban :rockout:

I vote everyone please calm down. No need to get so angry over a thread about Crysis.

*Radiates peaceful "go game and chill out" vibes*
 
You don't have much hair left, do you? I bet you've already pulled it all out and are about to have your third heart attack.
 
I just wanted to put my input into this thread by adding a little quote, kk?


Crysis doesn't push PC to its limit?!!?

O NOES!!!! :eek:
 

Thing is, Morgoth, rendering programs are not programmed for FPS, for one; they just get the job done as fast as possible. They also do not have AI, physics, etc. to calculate while rendering. Rendering programs make use of the CPU power because otherwise it would just sit there and make customers unhappy.

Games such as Crysis have CPU tasks that are totally separate from the GPU side of things. There would be no point in coding the game to use the CPU for rendering either, as the average gamer has, say, a 2.4 GHz AMD dual core or a 1.86 GHz C2D. By the time the game finished calculating the CPU-only tasks, there would be little time left for helping out the GPU, and that's not even mentioning the latencies involved.
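
To put the same point as rough code (my sketch of a typical frame, not Crytek's actual engine): the CPU does its own jobs and then just hands draw calls to the driver, so there is nothing useful for spare cores to do on the pixel side.

    #include <cstdio>

    // Rough shape of one frame in a typical engine -- a sketch of the
    // CPU/GPU split described above, NOT Crytek's actual code.
    struct World { int entities; };

    void update_ai(World&)    { /* CPU: pathfinding, enemy logic */ }
    void step_physics(World&) { /* CPU: collisions, ragdolls */ }
    void submit_draw_calls(World&) {
        // The CPU only hands work to the driver here; the heavy per-pixel
        // work (HDR, AA, AF, post-processing) then runs on the GPU
        // asynchronously, and this call returns quickly.
    }

    int main() {
        World world{100};
        for (int frame = 0; frame < 3; ++frame) {
            update_ai(world);         // cheap on a solid dual or quad core
            step_physics(world);
            submit_draw_calls(world); // after this the CPU mostly waits on the GPU
            // Shipping pixel work back to the CPU would mean copying scene
            // data across the bus every frame; the latency would eat any
            // gain, which is why a game can't do what an offline renderer does.
            std::printf("frame %d done\n", frame);
        }
    }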

Gamers looking for good fps usually upgrade in the order of new GPU, then CPU or RAM, then whatever is left; the one commonality is the thought of GPU first. It would be a waste of time and money to code CPU-dependent (look at the failure of Supreme Commander) versus GPU-dependent. Few people have the cash for high-end quad cores and 8800 GTX SLI (remember also, they code for stock components), so they will take the route with the best performance for most people.
 