Discussion in 'News' started by btarunr, Aug 28, 2008.
I have Windows XP SP3
These are the download images from our Russian site ...
Shut down, connect the monitor to the 9600 GSO, start, let the OS detect the display and configure it as a second display head, shut down, connect the monitor back to the HD 4850, and start up. The OS is then fooled into thinking there are display heads configured for both adapters.
I already did exactly that ... I even tried every voodoo trick in the book - nothing helps
Does anyone know when drivers like these will be out for the Radeon 4870?)))
Oh well :/
Someone will discover how it works eventually.
Everyone here thinks this is a good point for ATI; I think it isn't. Maybe it's good for users, but not for ATI, because it means every PC has to include an NVIDIA card in some role, either primary or physics. ATI needs to develop software of its own to solve this big problem.
Hence the Havok engine. Havok has been around a lot longer, so it's further ahead in the physics dept.
Havok's original engine only used the CPU for physics; only much later, after AGEIA came around, did they start on an engine that used the GPU. However, that was a separate licence that game developers had to opt for (and pay for), so adoption wasn't very big, I think. It's hard to say, because when people/companies say "Havok physics" nobody knows whether they mean the old CPU licence/SDK, which HL2 uses for instance, or the (relatively) newer GPU one. Plus I think their GPU one was partly non-interactive, mostly just visual, wasn't it? I'm not sure about the details.
But either way, if a game uses the one developed by AGEIA, PhysX, like the Unreal 3 engine does, then it doesn't matter how great your card's Havok support is, since PhysX is what the game requires. And right now I bet lots of developers are opting for PhysX, since it suddenly has a lot of people who can use it. Unless of course they're smart like the Crysis makers and just write their own physics engine and bypass the whole hassle. Although doing that on the GPU might be harder to develop than you think *shrug*
I have a board with 3 full-length PCIe slots, a 4870X2 and a 4870 in CFX, and I'm thinking about getting a 9800GT for PhysX. The reason for the 9800GT is that it's the fastest single-slot card I can think of, and it will fit perfectly between my two ATI beasts.
Is this a good move for me, or should I do something else with my time/money?
PhysX performance isn't all that proportional to GPU compute power beyond maybe an 8800 GS 384MB. IIRC the third long slot is PCI-E x4, isn't it?
x4 when 3 cards are in, yes. x8 when only 2 are used.
I can get a 9800GT for like $40, so it's not a price thing... just availability.
It also becomes a heat and power-draw thing
With 3 RV770s in his system, I don't think the power/heat of a 9800GT is an issue for him.
Point is, those RV770s crunch the graphics, but I don't think choosing a 9800 GT over an 8800 GS would translate to anything better than a higher power draw than he already has.
I agree, but the extra power required to go from the GS to the GT is NOTHING compared to the power draw he already has. Sure, it's pointless if he gets the same performance, but we don't really know how much power will be required over the next 6 months = 50+ new titles.
And then there's the $40 argument. If he can get a 9800GT for that money, I would never tell him to get something slower. He might even have to pay more for a GS!!
I don't think those 50+ titles have PhysX content that would make a dedicated PhysX unit such as an 8800 GS sweat. 96 NVIDIA SPs is still a huge amount of rated shader compute power. The 192-bit memory bus doesn't matter; the card isn't transferring large chunks of data (such as textures), it's just crunching lots of math in real time. To look at it another way: if a PhysX title does have a physics load that makes a dedicated 8800 GS sweat, a single-GTX 280 machine is in for a significant graphics performance hit.
If he can't get an 8800 GS for less than $40, the 9800 GT is cool.
I can get a 9800GT new in box for $40, but I'd have to pay retail for an 8800.
Heat.... fuck heat... who cares anyway? It's for benching, and I have some 130CFM fans cooling the vid cards anyway.
I think we both know each other's points; we've just been looking at the thing from different angles. This is how I see it, in order of importance:
- Price: IMHO, a 9800GT for $40 is a must-have. Period.
- Power consumption: the difference between the two cards is about 10W. The X2 alone consumes 300W. Add 150W for the HD4870, 120W for the rest of the system, and 75W for the baseline card (the GS), and we're talking about 645W under full load. 645 or 655, who cares?
- Performance: the GT probably won't get you better performance than the GS in the next year, maybe not even in two. Who cares? You won't need to change the card. IMO you can't apply the same criteria as with graphics, where a slightly underpowered card makes more sense because you'll need to upgrade it soon anyway. And taking the price into account, are you really going to risk future performance, or the possibility of needing to upgrade a lot sooner, just to get a cheaper or slightly less power-hungry card?
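The power-budget sum above can be sketched in a few lines. This is just back-of-the-envelope arithmetic using the wattage figures quoted in the post (they are forum estimates, not measured values):

```python
# Hypothetical full-load power budget from the figures quoted in this thread.
components = {
    "HD 4870 X2": 300,      # watts, as quoted
    "HD 4870": 150,
    "rest of system": 120,
}

def total_draw(physx_card_watts):
    """Estimated full-load draw (W) with a given dedicated PhysX card added."""
    return sum(components.values()) + physx_card_watts

print(total_draw(75))   # with an 8800 GS baseline card -> 645
print(total_draw(85))   # with a 9800 GT (~10 W more)   -> 655
```

Either way the delta is about 1.5% of the total, which is the point being made: the GS-vs-GT power difference is noise next to the rest of the rig.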
I'm also looking at resale value here. I can get the 9800GT, use it for a few months, and still get more out of it than I paid.
Not trying to dog on you, btarunr.
Then price is the only issue, go for it.
2 years from now, you think the industry will still let you use a 9800 GT for PhysX? You'll be hit by the standard syndrome: they'll come up with "The latest PhysX engine requires a CUDA <insert advanced version here>-supporting graphics card", and naturally all existing hardware will become 'obsolete'. Anyway, that's Fit's we're talking about. His hardware changes like the weather.
What I really want to know is whether it's worth the $40, or whether I should get something that has nothing to do with gfx/PhysX.
Something that can put those cards to use: a game. ...if not, for $40 nothing beats the 9800 GT.
I don't game though.