
Ubisoft's Far Cry 2 System Requirements Published

Go play Star Trek: Borg on a modern computer; it won't even start.

Or CoD:UO on a modern computer; the game hitches and runs badly.

Run 3DMark99 on a modern machine. I have, and my score isn't much better than 10k. Explain that.

Hexen II I must try.
 
It will tweak out as soon as you try to look at water. It runs... a little worse than Crysis on my computer (Hexen @ 800x600, Crysis at 1680x1050 High, 0xAA).
 
There is no way an 8600 GTS could handle Far Cry 2's effects beyond
1024x768, super low, with no AA & AF.
 
The screenshots I have seen did not impress me, nor did the graphics. It looks like STALKER in sunlight, to be totally honest.
 
Yep, and I might be able to run it at 800x600 with bugger-all detail.
 
Think logically: this is a massive game world, so if they expect it to be playable, textures etc. can't be terribly high unless they use a lot of z-optimizations in the drivers and in the game.
 
TBH I don't know why you have to argue with MY OPINION. You made clear (before this last post; here you just repeated the same thing in different words) that FOR YOU, the recommended requirements shown on current games offer satisfactory gameplay at satisfactory settings. You also made clear that you consider low settings on low-requirement hardware in recent games something "playable", or may I say, worth playing. I DO NOT (applicable to both concepts), and I know many if not most of the active people on these forums think the same. What are you trying to demonstrate?

Whether a setting is acceptable or not is not something objective; it is subjective to each player. There is no absolute setting, frame rate, etc. that can be named as acceptable, and as such there is no absolute hardware requirement that could fit such minimum and recommended definitions. 640x480 @ 20 fps could be acceptable for some, but not for me, that's for sure and by a great margin. For me, gaming at anything below 1024x768 @ 30 fps simply doesn't exist. And the same goes for low settings. In the past, low settings were something worth trying; today all games look like crap on low, and that's not acceptable, FOR ME, remember I'm giving my opinion. Medium settings have been devalued too, in comparison to what medium meant in the past.

And just to finish, I want to say that I do have old hardware, so I know which settings are playable on what hardware and which are not. But again, since it's my opinion, I hold that anything below 30 fps average is unacceptable, anything below 1024x768 is unacceptable, and low settings as a whole are unacceptable. Period.
I agree here, although I am even more strict. I won't play a game at all if I can't run it maxed out completely with 0xAA/16xAF at 1920x1200 @ at least 40fps average (I'll accept 30fps for Crysis, as it seems to run smoother at lower rates). If I have to, I'll go down to the 1440x900 4xAA/16xAF that my second rig offers, but all eye candy must still be on. I'll sacrifice some AA, but I won't ever sacrifice eye candy.

To me, recommended means I should be able to max it out at the most common resolutions available at the time of release.

As such, I still haven't played Crysis for more than 5 minutes. I'll try it again with a couple of OCed 4850's when I get them.
 
My requirements, ever since I got my 6800GT in 2005, are similar: I have to have at least 4xAA and 8xAF @ 12x10 on high or I simply don't enjoy a game. That's one reason Crysis and I never got along; there are jaggies if you look, and the textures blur far too close.

That, and my name isn't Dave, so it wouldn't take as much of an affinity to me as it would to a Dave.
 
I agree here, although I am even more strict. I won't play a game at all if I can't run it maxed out completely with 0xAA/16xAF at 1920x1200 @ at least 40fps average (I'll accept 30fps for Crysis, as it seems to run smoother at lower rates). If I have to, I'll go down to the 1440x900 4xAA/16xAF that my second rig offers, but all eye candy must still be on. I'll sacrifice some AA, but I won't ever sacrifice eye candy.

To me, recommended means I should be able to max it out at the most common resolutions available at the time of release.

As such, I still haven't played Crysis for more than 5 minutes. I'll try it again with a couple of OCed 4850's when I get them.

I have the same requirements but 1680x1050 instead of your 1920x1200.
 
Hehe, we all have our requirements. I'm still on a CRT, so 12x10 is mine, which is why getting the uber card setups never did me much good.
 
lol on this "old hardware" have you even played crysis because a lot of people with respectable systems are quite content with playing at 1024x768 and 30fps, cause thats all that crysis offers, but in our minds thats wholly acceptable, and what system are you on about cause the one in your specs shows us a x2 4800+, 2gb ram, a raptor and a 8800gt (please correct me if this is incorrect :D) yeah thats really old hardware :laugh:

I find it hard to believe that you don't even understand my posts. Regarding system requirements, to me it's quite simple. Let me break it down for you:

Minimum = you can run it without any eye candy and at a low res, with a high enough framerate to PLAY the game (which is the point, is it not?).

Recommended = you can play the game with acceptable eye candy and at a decent res, also with a high enough framerate to PLAY the game.

This, unfortunately, is not a forum based entirely on YOUR opinion. This is a public forum, and unfortunately things don't always go your way. What you call "acceptable" is entirely different from what other people think, and luckily for us we base things on fact here, not opinion.

Don't be an arse.

As I've already said, minimum specifications are NOT your opinion; they are not DarkMatter's or anybody else's opinion; they are what the developers feel is the minimum. Before anyone has a go, what I mean is that the cold, hard specifications written on the back of the box are what the developers think are the minimum/recommended, etc.

But every person has a different opinion as to what they will accept as a minimum, and these can differ wildly from what the developers put. As Wile E has pointed out, he expects a pretty high standard for his minimum, which is fair enough; there are a lot of games he can play at that standard, but there are some he can't.

Me? All I want is 1680x1050 and high settings, running smoothly.
 
My minimum for playing a game is it turning on. To be honest, and IMO, I really don't care what the game looks like as long as it is a good game to play, which is my basis for being a gamer. I do happen to get the latest hardware because for once I have the chance to have the best of the best; I didn't have that chance when I was younger, but now that I have a job I can buy what I want.
 
Minimum = you can run it without any eye candy and at a low res, with a high enough framerate to PLAY the game (which is the point, is it not?).

Recommended = you can play the game with acceptable eye candy and at a decent res, also with a high enough framerate to PLAY the game.

Look, dude.

Minimum specs = the game runs like utter crap, not worth playing.

Recommended specs = the game runs acceptably, but only just; nothing to get excited about.


I always tend to have a fairly up-to-date system, but I will not touch resource-hogging games like Crysis, and possibly Far Cry 2, as I would much rather play older games maxed out to the hilt than modern ones with compromised settings.

In an FPS game (and let's face it, in most other games) a solid 60 FPS is what is really required, both for the enjoyment of a high-quality gaming experience and in the name of preserving your eyeballs. I think game developers should start being a bit more honest with their 'specifications', aiming for their listed recommended specs to reflect mostly 60 FPS at medium settings and a reasonable resolution. If they are worried about frightening away a large section of their potential customers, then perhaps they should just lower the performance bar of their games so that a wider span of the world's home PC rigs can handle their software.
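To put concrete numbers on that 60 FPS target: a steady N fps means the whole frame (game logic plus rendering) has to finish in 1000/N milliseconds. A minimal sketch in plain C++, using only the frame rates mentioned in this thread:

```cpp
#include <cstdio>

int main()
{
    // Frame-time budget: budget_ms = 1000 / fps.
    const double targets[] = {30.0, 40.0, 60.0};
    for (double fps : targets)
        std::printf("%2.0f fps -> %.2f ms per frame\n", fps, 1000.0 / fps);
    return 0;
}
// Output: 30 fps -> 33.33 ms, 40 fps -> 25.00 ms, 60 fps -> 16.67 ms.
// Halving the frame time is why a solid 60 fps demands so much more headroom
// than a 30 fps average.
```

Note that an average of 60 fps can still dip well below that in heavy scenes, which is why a solid 60 is a stricter target than the averages quoted in reviews.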
 
Thanks Darknova.

@mullered07

You are acting like a moron. I'm not saying that you are one, but you are more dedicated to "winning" this "battle" than to understanding what the rest of us are trying to tell you. A battle that really doesn't exist, as it's only one-sided: yours. What you don't realise is that you are arguing with OPINIONS, not facts, and telling us that our opinions are wrong. Neither you nor the developers can say what the minimum or acceptable gameplay is. We are saying (at least I am) that for us, the settings that the recommendations on the box can guarantee nowadays are not acceptable. YOU MADE CLEAR, PLEASE DON'T WRITE IT AGAIN, THAT FOR YOU THOSE ARE ACCEPTABLE. THEY ARE NOT FOR US.

All that you are saying, besides that, is that because you accept those settings, those are the absolute requirements for a game. They aren't, regardless of whether it is you or the developers themselves who say they are. I'll give a simple example: the next id games, Doom 4 and Rage. According to Carmack, the requirements for both games will be very similar, but because of the different nature of the games, they will aim for 60 fps in Rage while aiming for 30 fps in Doom 4. See, same requirements, different settings. They are deciding what "playable" or "enjoyable" is long before they launch the game, but that means nothing, as it's the people who are going to play them who will decide in the end, based on their opinions. That's what has happened with Crysis, after all. I did enjoy Crysis a lot and I think that the requirements were acceptable for what it offers. You will have a hard time finding any member who has defended that game more than me; ask others if you don't believe me. But you DO need a much better PC than what the requirements suggest to play that game. And even though it's not as pronounced, that's what happens with almost all other high-profile games.

Just to satisfy your curiosity, I have the following computers (I will list those of my brother, dad and uncles too, as I have access to them and I do play on them a lot, especially for testing games):

1- The one in my specs. It's going to be replaced by a Q6600, P45, 4GB DDR2 really soon. I already have the components; I just have to gather the strength to put it all together.

2- Athlon 64 3700+, 2 GB DDR400, 7900 GTX on a cheap ASRock Dual. Going to be replaced by the current PC, except the graphics card.

3- P4 2.5 GHz, 2 GB DDR400, 6800 GT on an SI655 mobo. Going to replace it too, with the spare parts from the above.

4- Athlon Thunderbird 1000 MHz, 1GB DDR266, ATI 9600 Pro. I have it in my hometown.

5- Asus laptop. 1.6 GHz Core 2, 2 GB DDR2 667, HD 2400.

Now, the others' PCs:

6- Brother's PC. Athlon X2 4200+, 2GB DDR 400, X1900XT.

7- Dad's PC. Pentium D 950, 8600 GT, 2GB DDR2 667.

8- Uncle 1's PC. Dual Xeon 2.8 GHz, 2 GB RAMBUS, X850 XT.

9- Uncle 2's PC. Pentium D 930, 2 GB DDR2 667, 8400 GS.

As you can see, I have a wide range of PCs to test on. I can test on many others, ranging from a 486 to a PIII, that are also in active use and within 3 km of my home, used by my grandma and aunts.
 
The specs seem kind of odd to me.

Why require a 6800, but only an X1650 on the ATI side? Wasn't the 6600GT about the same, performance-wise, as the X1650?

And why make the X1900 the recommended card on the ATI side, but make it a generation higher on the nVidia side? Why not just make the recommended card a 7900GT, or something around the same performance as the X1900 from the same generation?

The X1650 was the 7600GT's rival, and the 7600GT was better than the 6800. As for the second question, we can say that both the X1900 and the 8600GTS can still be found on the market, because the 7900GT has disappeared...
 
Wow, seems good. I think the Far Cry 2 engine is better than the Crysis engine; I hear they are working very hard on it. Also, I saw that one of the Tom's Hardware team went to the E3 game show and saw Far Cry 2 running on a big screen (or maybe not that big), and it ran at the best quality and performance on 2x 8800 Ultras.
 
As long as it looks pretty and can actually run on something other than the latest and greatest hardware, I'll be happy. *fingers crossed*
 
The X1650 was the 7600GT's rival, and the 7600GT was better than the 6800. As for the second question, we can say that both the X1900 and the 8600GTS can still be found on the market, because the 7900GT has disappeared...

Then why did they not say ATI HD 2600 vs. Nvidia 8600GTS? They are from the same generation and meant to compete on performance.
 
Then why did they not say ATI HD 2600 vs. Nvidia 8600GTS? They are from the same generation and meant to compete on performance.

Because the HD 2600 cards suck. The Nvidia 8600GTS is about equal to the X1900GT; that's why they mention the X1900GT and not the HD 2600.

And it's 100% normal that they say GeForce 6800GT or ATI X1650 Pro, because overall they have the same performance...

The X1650 was faster than the 7600GT in the beginning, but after the Nvidia ForceWare 77.30 release, Nvidia was faster again.
Nvidia released the 7900GS because the X1650 was winning against the 7600GT in most benchmarks (in the beginning, at least).
 
Because the HD 2600 cards suck. The Nvidia 8600GTS is about equal to the X1900GT; that's why they mention the X1900GT and not the HD 2600.

They did not mention an X1900GT for the minimum (that was the recommended); they mentioned an X1650.

100% normal that they say GeForce 6800GT or ATI X1650 Pro, because overall they have the same performance...

They do have the same performance, but it is strange, because the 6800s were released alongside the X800 series of cards, not the X1x00 series; one was released as a high-end card and one as midrange. That's not 100% normal.
 
The X800 is SHADER MODEL 2.0b! Far Cry 2 is probably Shader Model 3, so the X800 can't even support it!
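Since shader model support is a hard capability rather than a speed issue, here is a minimal sketch of how a game could query it at startup through the Direct3D 9 caps (assumes a Windows build linked against d3d9.lib; an illustration, not Ubisoft's actual check):

```cpp
#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the D3D9 object and ask the default adapter for its capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs the highest pixel shader model the driver reports.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            std::printf("Pixel Shader 3.0 supported\n");
        else
            std::printf("Highest pixel shader model: %lu.%lu\n",
                        (caps.PixelShaderVersion >> 8) & 0xFF,
                        caps.PixelShaderVersion & 0xFF);
    }
    d3d->Release();
    return 0;
}
```

An X800-class card would report 2.x here, so a strict Shader Model 3.0 requirement rules it out no matter how fast it is.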
 
When did any Core 2 Duo chip equal an X2 5200+? It makes C2D chips look better than they are.

Most lower-end Core 2 E6x00s and E4x00s at stock are on par with similarly clocked AMD chips.
 
The X800 is SHADER MODEL 2.0b! Far Cry 2 is probably Shader Model 3, so the X800 can't even support it!

I'm just making the point that it's not 100% normal to compare a midrange card (X1650) to a high-end card (6800) that's a generation older.
 