
Mafia II System Requirements

It works on XP, it works on 7. It's not hard for Nvidia to say "this doesn't work on Vista".

Again, you use the pointless argument about "game A doesn't work on config X" - it works, and it works fine. With the leaked beta drivers every game worked on every config.

There ARE no compatibility issues, other than the one about getting two sets of drivers installed on the one system. Nvidia never said there were, so who did? When did that become the official reason? Did they email you and conveniently forget everyone else? Please.

All you're doing is making things up. Where I come from we call that fiction, and/or conspiracy theories. In the context you're using it, it's called being a fanboy. Here's a good summary for you:
(n) Technocratic zealots; evangelicals of geekery. Characterized by irrational advocacy of a particular OS, console, company, or franchise.

As far as you are concerned, this is all ATI's fault. Nvidia is not at fault and never will be. You are blinded to any other possibilities, because you made up your mind long ago, regardless of any facts or information to the contrary. It's very tiring to have to keep reading the same crap over and over again.
 

The situation we are in now is both ATI's and Nvidia's fault. But everything is a consequence of something else, and the one that first blocked PhysX on AMD was AMD. My problem with that, and the reason I blame AMD, is that they've always lied about the issue and downplayed the feature, so that now it's "just a gimmick". And they did so because it was better for them, not for us. And yes, they are guilty of that. I don't have a real problem with them not supporting PhysX, because business is business, but I do have a problem with them bashing and downplaying it, all while lying about the reason for doing so. If you don't support something, all well and good, but keep quiet about it, because there are people who do like it. Look at the example of HDMI: they supported it and hyped it as an advantage over the competition, and it's far from being open, so the stated reason for not supporting PhysX is complete BS.

And there ARE compatibility issues. I've seen many in many forums, Nvidia and AMD forums included, and it's not something supported by Nvidia/AMD, so idk what those people are thinking. I know what you are going to say: "I have tried it and it works, xxxxxx tried it and it works flawlessly" (and you can probably link to 5 successful examples), etc, etc, etc - same crappy argument as always*. I have two G92 cards and know people with laptops with Nvidia mobile cards and guess what? We've never had any problem with them, but the bump issue exists. And problems with ATI+PhysX exist. Period.

Besides, Nvidia could support 2 sets of drivers or 20, yes, why not. AMD could support the X1000 series and even the 8500, but they don't, because it costs them money and they are simply not willing to spend that money on such a "small" user base. AND THEY SOLD MILLIONS OF THOSE CARDS!! How many people would use a dedicated PhysX card?

*And I'm conscious that this is a human-nature problem. I've seen the same argument being used to say that we are NOT in a crisis: "I'm fine! I can pay my bills! I didn't lose my job, only my two neighbors did, but that's because they are jackasses..."
 
AMD never blocked anything with PhysX. Again with making things up.

Problems with PhysX? Sure: how it crashes all the time, has annoying requirements about having the right version installed to run games, and how certain Nvidia drivers just don't play nice with it? Of course. But none of those problems are specific to ATI.

"Problems exist with ATI + Nvidia. Period" - please inform me of them. Other than the ones Nvidia forced in, I don't know of any.

Yet another thread derailed with this useless crap...
 

AMD did block PhysX on their hardware. They did it directly by refusing the offer, and with dirty tricks after that. And yeah, later Nvidia played dirty too; neither are angels. But Nvidia had every interest in pushing PhysX adoption: they offered it for free, just like they offer it for free to game developers, give free courses at various universities, etc, etc. I fail to see how people evangelize all the PR crap that AMD spills out and bash everything Nvidia does. It can only be fanboyism.

PhysX has never crashed on me, so I have as many proofs as you have that it works flawlessly. If you have seen it failing, it's probably because you were using it along with an ATI card.

Just please stop the nonsense and confine your PhysX bashing to the PhysX bashing thread you created a while back.

And if you want to know when the thread got derailed, go no further than post 4. If you want to prevent this crap from happening, do your work and moderate, instead of adding your utterly useless BS bashing to the fire.
 

It works on leaked beta drivers and you consistently blame ATI? :confused:

You have been told it works by Mussels and you basically call him a liar.

Then you have the gall to define a discussion on the implications of PhysX as simply "bashing" and "BS"?

Moreover, it is impossible to discuss, and I repeat discuss, the release of a TWIMTBP game without discussing Nvidia marketing practices, less so in a thread specifically concerned with the specifications of a TWIMTBP game. On this point I disagree with both you and Mussels: PhysX discussion is an essential part of any thread concerned with an Nvidia-sponsored game. I think people have a right to know the full ramifications of their purchase.

Have a look a few posts back. I disagreed with Tatty. He stated his point of view and I stated mine. Do you see any further posts between us? You seem to be grinding your organ louder than anybody else; however, your personal crusade to defend Nvidia's good name and the manner in which you are going about it are more likely to get this thread locked than any supposed derailment, bashing or BS. Is that your intention?
 
You both sound like huge fanboys, lol (Mussels and Bene).

All I know is, Nvidia has a chance to make big bucks with PhysX, and disabling it when it sees that an ATI card is present is just plain stupid. I understand their tactic though: they want Nvidia products in PCs, and Nvidia products only. I bet it would be the same situation if ATI owned PhysX. Sales would move up, but overall in some aspects they would take a hit: if ATI puts out a great card and people are buying that for their primary card, Nvidia doesn't want to be sitting in second place with a lot of their income coming from people using their cards just for PhysX. They want to dominate the inside of your PC and I don't see anything wrong with that, even if I think it's partially stupid. It's theirs and they can do whatever the hell they want with it, and if ATI doesn't like that, create your own tech and market the shit out of it.
 
It works on leaked beta drivers and you consistently blame ATI? :confused:

You have been told it works by Mussels and you basically call him a liar.

Then you have the gall to define a discussion on the implications of PhysX as simply "bashing" and "BS"?

Moreover, it is impossible to discuss, and I repeat discuss, the release of a TWIMTBP game without discussing Nvidia marketing practices, less so in a thread specifically concerned with the specifications of a TWIMTBP game. On this point I disagree with both you and Mussels: PhysX discussion is an essential part of any thread concerned with an Nvidia-sponsored game. I think people have a right to know the full ramifications of their purchase.

Have a look a few posts back. I disagreed with Tatty. He stated his point of view and I stated mine. Do you see any further posts between us? You seem to be grinding your organ louder than anybody else; however, your personal crusade to defend Nvidia's good name and the manner in which you are going about it are more likely to get this thread locked than any supposed derailment, bashing or BS. Is that your intention?

To work, and to work without issues, are very different things. CPUs, GPUs and the rest can run at much higher clocks than they are released at, but there are QA margins that have to be taken into consideration when something is official. Almost every single person here can take an i7 920 and have it running at 3.6 GHz without an issue, but it's not true, speaking in generalities, that any i7 can be run at 3.6 GHz. Give it to a non-enthusiast and you will have that i7 dying in a few months. It's only safe as long as the PC is taken care of, something that 90% of PC users don't do. 3.6 GHz on a 920 is not a reliable official clock. It's the same with drivers. Put them in the hands of experts and you can make them work. It's false that the drivers work flawlessly, even the betas, and the hack has a lot of QA put into it (made by users for free). It's only been 5 months since support was dropped, and all the games that have been tested are old games, games that were on the market while PhysX+AMD was still supported. It's going to take months until the lack of support has any effect. You can use modern GPU drivers on unsupported cards, and in most cases they work, but not always, and companies are legally forced to either support every possibility or drop support.
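
The overclocking arithmetic above is easy to check: the i7 920 pairs a 20x multiplier with a 133 MHz base clock (about 2.66 GHz stock), so the 3.6 GHz figure means raising the base clock to 180 MHz. A minimal sketch of that relation (the 920 figures are stated here for illustration):

```python
def core_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """Core clock is simply the base clock times the CPU multiplier."""
    return bclk_mhz * multiplier

# Stock i7 920: 133 MHz BCLK x 20 multiplier ~= 2.66 GHz
print(core_clock_mhz(133, 20))  # 2660.0

# The 3.6 GHz overclock discussed above needs the BCLK raised to 180 MHz
print(core_clock_mhz(180, 20))  # 3600.0
```

Nothing about that multiplication says anything about stability, which is the poster's point: the silicon, cooling and voltage determine whether 3600 MHz holds up, not the arithmetic.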

You guys can claim bias all you want because I'm defending common sense against your desire to bash, but that won't change the fact that I'm only against those with flawed and biased opinions. I'm tired of being called an Nvidia fanboy because I go against those who bash the company. I will always go against the BS and the bias, and it won't matter which company I am defending, but I challenge any of you to show me a thread where AMD is bashed for their mistakes and their BS (which they also have).
 
Put them in the hands of experts and you can make them work. It's false that the drivers work flawlessly, even the betas, and the hack has a lot of QA put into it (made by users for free). It's only been 5 months since support was dropped, and all the games that have been tested are old games, games that were on the market while PhysX+AMD was still supported. It's going to take months until the lack of support has any effect. You can use modern GPU drivers on unsupported cards, and in most cases they work, but not always, and companies are legally forced to either support every possibility or drop support.

There has to be an element of truth to all of that, hence my stance on PhysX and anger with Nvidia: if it is all that, grant me the option of employing it on official drivers. I mean, surely they can simply choose to enable rather than disable it on all upcoming driver releases?

You guys can claim bias all you want because I'm defending common sense against your desire to bash, but that won't change the fact that I'm only against those with flawed and biased opinions.

Firstly, by addressing me as "you guys", you wrongly group me into a pack with other forum users, when in fact, whilst we may agree on certain issues, we post as individuals and I represent nobody but myself. Nor is it fair to depict yourself as the lone soul bravely standing against the aforementioned pack, given that you conjured it up.

No amount of bias, flawed thinking or opinions can change one inescapable fact, and I see you use the term loosely: Nvidia disables PhysX where an ATI card is detected. Your argument seems to focus exclusively on promoting the stance that "disabling" and "discontinuing support" are not synonymous.


I'm tired of being called an Nvidia fanboy because I go against those who bash the company.

I'm sure I could find several things I dislike about owning an ATI card, or improvements that I would like to see, provided I was inclined to do so and assuming I had no allegiance to the company beyond my condition as an occasional customer.


I will always go against the BS and the bias, and it won't matter which company I am defending, but I challenge any of you to show me a thread where AMD is bashed for their mistakes and their BS (which they also have).

ATI (AMD) sometimes release crap drivers that are worse than the drivers they are supposed to replace.

AMD probably could and should put more resources into the development of an open-source 3D engine.

AMD, like all other companies, occasionally release products that we can justifiably define as complete crap.

Well, that's hopefully the end of the last argument. ;)
 

McC, I'm sorry for confusing you, but I did tell you in another thread that when I write in a forum I do so everybody can see what I say. That way I don't have to replicate things (although I'm very well aware of the fact that I do repeat things a lot; just don't shoot me for doing it AND for trying to avoid it too :laugh:). I'm just asking you to take that into consideration and not think that everything is directed at you, and only you, just because I quoted you above.

All that is my fault anyway, I know, but just try to understand how I am and our conversations will be better and easier. :toast:

So, regarding the bias, read Mussels' posts and the comment on CDdude's post, etc.


@Everyone interested :D

^^ I'll try to make such a differentiation when talking to everybody.

Regarding the support: I think you guys have a very limited notion of what supporting something means, yet many still blame bad drivers and poor game optimization, meaning you do know the effects of bad support. If QA (on anything) were as easy and cheap as many of you think it is, don't you think that every company would just do it always, and properly, in order to avoid all the problems they have later? If bad optimization happens, it's because QA is a very expensive and time-consuming enterprise. Big enough that it seems to make sense (as a business) to many companies (and game developers) to risk 3 years of hard work and development time on that last link in the chain. Bad QA can ruin your product, and it does happen that poor QA ruins some products. And it's also a fact that most of those products get fixed afterwards, meaning the fix could have been done before, but they didn't do it, because: what if it works without spending all that much on QA? Let's try with poor QA and see what happens.

Well, now specifically talking about PhysX: just because something works with older hardware and older games, that doesn't ensure it will work with future releases. Future releases means games, and it also means Southern/Northern Islands, GF104, GF106... Is this so difficult to understand? AMD will never disclose details about upcoming releases to Nvidia, much less give them working samples (and they will not do the QA themselves), so ensuring a quality experience from launch day is not possible, and Nvidia decided it was time to stop wasting money. There is a point at which you have to stop supporting something, if you are ever going to stop supporting it. And supporting PhysX on non-Nvidia GPUs just became too expensive. I keep hearing people ask why Nvidia dropped support, but I already said why. The question I implied when I mentioned AMD's dropped support for X1000 and older cards remains unanswered. Why did AMD drop that support? It's simple, and it's the same thing that I keep saying every other line of my posts here: supporting anything costs money, and AMD couldn't, or didn't want to, "waste" money supporting those cards and ensuring that they work with sufficient quality. Same for PhysX.

And now it's clear that support was dropped because it doesn't make sense as a business practice (take into account that PhysX is free, you pay absolutely nothing for it; if you think you pay a premium, please take a look at the GTX 460 and think again, remember the 8800 GT and think again).

Yeah, they dropped it officially, and then there are two options: keep the feature there and see what happens, or disable it. Seeing how people reacted to the beta, it's clear that leaving the feature in translates in people's minds into "there is support", and it's later that the complaints come, so they just disabled it. And yes, hurting the competition is always a constant, but it's not the only or even the major reason to drop the support.


EDIT: Damn, I don't know why my point is so hard for you to understand. Either I'm talking to very young people, or everybody has short memories. The days in which you had to look carefully at which memory worked with which motherboard are not so far back in my memory...
 
I know I'm looking forward to this game! :) This is one of the only games I'm somewhat excited about.. which is kind of sad.
 

I loved Mafia, so I'm awaiting this one too. And I'm in the same situation, since apart from this game I only care about Rage and Crysis 2, both of which are most probably 2011 games. I'm curious about Deus Ex too, but same thing, I think it's 2011 material as well.
 
I was thinking about the GT240 as a second card for PhysX and I don't know if it is worth it, like doing all the driver tweaks just for a couple of games. I am considering getting a GT240 1GB DDR3 but don't know if it is worth doing. :ohwell:
 
Get a GTX 460 for physx
 
Damn, I don't know why my point is so hard for you to understand. Either I'm talking to very young people, or everybody has short memories.

I understand, but I disagree. Do you consider 37 young? In any event, I am old enough to know that we have both said more than enough on the issue.

This game, like all other TWIMTBP games, raises a number of questions that require answers:

a) What does PhysX provide in this game, what exactly does it entail?

b) Will PhysX impact performance and to what extent?

c) Does this game require a dedicated PhysX card? Is it worth it?

d) Is it worth the effort and expense to buy a dedicated PhysX card if I use an ATI card?

We have each addressed some of these questions, but we can only provide opinions, never facts or irrefutable arguments. Others must answer these questions for themselves; we have said enough and there is no need for repetition.
 
Get a GTX 460 for physx

Utter waste of money, unless you're just a benching monkey.


Mr McC said:
This game, like all other TWIMTBP games, raises a number of questions that require answers:

a) What does PhysX provide in this game, what exactly does it entail?

b) Will PhysX impact performance and to what extent?

c) Does this game require a dedicated PhysX card? Is it worth it?

d) Is it worth the effort and expense to buy a dedicated PhysX card if I use an ATI card?

I'm taking educated guesses here like everyone else:

a) See Batman: Arkham Asylum... smoke, bricks... physics stuff.

b) and c) It will require a card if you want to use the hardware PhysX effects; without the card you will lose the PhysX features. Performance won't really be impacted either way.

d) It's up to you. I have one (a GT240) and I don't really use it much. I fold on it occasionally. PhysX titles that benefit from a dedicated card are pretty few; then again, once I heard that Mafia II was using PhysX, I bucked up and bought the GT240. If I had to decide whether to buy a PhysX card again, would I? Yes, as the state of PC gaming is what it is: Nvidia has a proprietary physics system that some developers use. I would never buy a high-end or even middle-of-the-road card just for PhysX, as that's a waste. The GT240 is a very good card, with 10 W power consumption at idle (which is what it does most of the time). Plus, having an Nvidia card in your system along with ATI is nice for those encoding programs and other apps that use one or the other.
 
Their decision on the sky-high requirements should be reversed and changed.

The requirements are very high and nobody can enjoy the PhysX and graphics to the fullest. This can't be right.

We should complain and boycott their games. That's what I'm going to do.

Nvidia just gave them money, or bribed them, to make the requirements that high. Just like Crysis.
 

I'm going to boycott sand, the key component of silicon. Might as well start at the source, right? :toast:

For PhysX a high-end card isn't needed. Don't listen to marketing; be informed.
 
erocker: yes. We SHOULD respond to their evil decision and evil thoughts about the cruel requirements of the game.

Just because they are the developer of the game, it doesn't give them the right to do whatever the hell they want.

I think the best way to hurt them financially and emotionally is to not buy their games and instead pirate them. How is that? :)

I want to hurt their sales.
 
Their decision on the sky-high requirements should be reversed and changed.

The requirements are very high and nobody can enjoy the PhysX and graphics to the fullest. This can't be right.

We should complain and boycott their games. That's what I'm going to do.

Nvidia just gave them money, or bribed them, to make the requirements that high. Just like Crysis.

Keep in mind that system requirements are just a general idea of where to aim; it's really not something to take literally. Also, the high system requirements aren't that bad: a 9800 GTX or 3870 and some slow quad core, how is that too high? If you're talking about PhysX requirements, then as I said above, don't take them too literally, as they generally under- or overrate things. Do you seriously think someone with a Pentium D and an 8600 will play this game sufficiently? Hell no, it's just there as a base, not something to be taken seriously.

And you're gonna boycott their games just because you have a system that can't play their games? Makes no sense.

And oh my god, please tell me why Nvidia would bribe a game dev to give their games high system requirements. It's just marketing: Nvidia slaps their name on the game for advertising and more money. How would they decide what the system specs are gonna be? (The only aspects they decide are the PhysX side of things; if they want it in a game, they're gonna pay for it to be a featured technology in said game.)

I think the best way to hurt them financially and emotionally is to not buy their games and instead pirate them. How is that? :)

:twitch::shadedshu
 
Aww, c'mon already. Stop with the NV hate. LEARN TO LIVE WITH IT INSTEAD OF COMPLAINING ALL THE TIME. It was YOUR choice to buy ATI, so you knew that you would lack stuff like PhysX and CUDA. And it's stupid to not buy a game just because it uses PhysX while you actually waited and waited for the game. And it seems to have become an awesome game, even without the neat Nvidia extras. I won't buy it because I still need to buy TDU2 and Crysis 2 :P

Your complaining (not just you, but in general) will not help anyone. It will be like this for a long time; accept it. And if you won't buy any games with PhysX, go ahead, don't. But you'll be missing some good titles that are great even without PhysX.

Most people purchased ATI cards because, price-wise, they outperform NV cards: a 4870X2 goes for $200 used and will beat a 5850 or GTX 470 in almost every game. CUDA is not something I will miss, as nothing I have utilizes it, so that's a moot point; thanks for reading your NV box out loud. PhysX should be possible on ATI-based systems. GT240 sales would go up if, in Windows 7, I could run my 2x 4870X2s and a GT240: performance on par with dual GTX 480s, and PhysX to boot, for the cost of one GTX 480 or one 5870.

Whenever my PC hits minimum, I'm upgrading.

Now that's a lie :nutkick:
 
Thanks erocker :). I'm working on it and will post here when I've finished the petition.

In my petition, I will tell them to remove the PhysX from their game, or at least reduce the requirements for it, if they don't want to face low sales and piracy.

If they don't, they will alienate their customers by optimising their games only for Nvidia cards, and they will face piracy just like Crytek did with Crysis.
 
erocker: yes. We SHOULD respond to their evil decision and evil thoughts about the cruel requirements of the game.

Just because they are the developer of the game, it doesn't give them the right to do whatever the hell they want.

I think the best way to hurt them financially and emotionally is to not buy their games and instead pirate them. How is that? :)

I want to hurt their sales.

Upset because you can't afford the hardware to run the game at max settings? That's what it boils down to...

IMO, more games should be created with awesome graphics that cannot be maxed even with a single GTX 470 or 5850. Why? A few years later, when you either buy the game for the first time or replay it, it will look a lot better than when it first came out, graphics-wise :P




If you were the developer of a game, wouldn't you like to make the game HOWEVER THE FUCK YOU WANTED TO? It's your game. No one else has to play or buy it.


Should Masterfoods Corp stop making Snickers and only sell Mars bars? Not everyone can eat the nuts in Snickers and not everyone likes the extra flavor.

No, Masterfoods Corp should be able to make any bar they want to.





(Disclaimer: Google said Masterfoods Corporation makes Mars and Snickers bars. Don't blame me if they don't make them at all.)
EDIT: I think it's actually called Mars, Incorporated. :P
Thanks erocker :). I'm working on it and will post here when I've finished the petition.

In my petition, I will tell them to remove the PhysX from their game, or at least reduce the requirements for it, if they don't want to face low sales and piracy.

If they don't, they will alienate their customers by optimising their games only for Nvidia cards, and they will face piracy just like Crytek did with Crysis.

You can turn off hardware-based PhysX support and it will reduce it to CPU level... please go and read up on the difference. You don't get all the same features with CPU vs GPU, and it doesn't require as much "horsepower" to run.

I think you're just upset that your ATI-only PC can't run it maxed out.
 
erocker: yes. We SHOULD respond to their evil decision and evil thoughts about the cruel requirements of the game.

Just because they are the developer of the game, it doesn't give them the right to do whatever the hell they want.

I think the best way to hurt them financially and emotionally is to not buy their games and instead pirate them. How is that? :)

I want to hurt their sales.

Steady on, this is a company, not Satan; "evil" and "cruel" may be going a bit far, unless you have downed a few beers whilst watching "Money as Debt".

As regards hurting their sales, your logic is flawed: you were not going to buy the game in the first place, whereby your act of piracy does not equate to a lost sale. Your heroic act of defiance will simply show up as an additional entry in the download statistics taken from some server, which will be brandished by some corporate moron whilst he attempts to justify the need for additional DRM.
 
If you were the developer of a game, wouldn't you like to make the game HOWEVER THE FUCK YOU WANTED TO? It's your game. No one else has to play or buy it.


But they didn't make it how they wanted; it's a marketing ploy that will show a huge TWIMTBP logo when the game starts up. If ATI dumped shit-tons of money into game developers, the requirements would say ATI cards; instead it's just one more NV-biased game.

Haha, it's a lot like how, when the vendor string on an AMD CPU is changed to "GenuineIntel", it mystically gains performance in certain compilers.
source 1 source 2
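
The trick being alluded to is vendor-string dispatch: some runtimes picked their optimized code path only when the CPUID vendor ID read "GenuineIntel", rather than checking which instruction sets the CPU actually reports. A toy sketch of that gating logic (the function and path names here are invented for illustration; real dispatchers read the vendor string via the CPUID instruction):

```python
def pick_code_path(vendor_string: str) -> str:
    """Toy model of vendor-based dispatch: the fast path is gated on the
    CPUID vendor ID instead of on the features the CPU supports."""
    if vendor_string == "GenuineIntel":
        return "optimized SSE path"
    # An AMD CPU supporting the exact same instructions falls through here
    return "generic baseline path"

# "GenuineIntel" and "AuthenticAMD" are the real CPUID vendor strings
print(pick_code_path("GenuineIntel"))  # optimized SSE path
print(pick_code_path("AuthenticAMD"))  # generic baseline path
```

Which is why spoofing the vendor string "mystically" changed performance: the CPU's capabilities never changed, only the branch taken by the dispatcher.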
 