
3-way SLI Action with NVIDIA GeForce GTX 280 and 3DMark Vantage

malware

A little joy for today, one week before the official announcement of the NVIDIA GeForce GTX 280 cards. Here's a little sneak peek at what to expect from three NVIDIA GeForce GTX 280 cards in tri-SLI configuration, an Intel QX9650 processor overclocked to 4 GHz, and the 3DMark Vantage Vista DX10 benchmark. The clock speeds of all three cards can be seen in the photo. The end result is 21,350 marks.



View at TechPowerUp Main Site
 
That would be so perfect in my new SFF case:rockout:

(thanks again malware for all the good posts)
 
HOLY HELL:eek:

But I would like to see it tested with an AMD proc.
 
HOLY HELL:eek:

But I would like to see it tested with an AMD proc.

It would be nice to know if/how much of a difference it really makes.
 
@ gallatin : lol
This supposed GTX 280 benchmark is twice as good as the supposed 4850 CF benchmark.
 
That's not surprising, though those scores sound damn good, since Vantage is the newest and is supposed to be the benchmark for DX10, and these cards can already dominate it. Though as always, I'm sure Vantage is no better an interpretation of how performance will be in games than 06 was.
 
I have to ask... why? For the love of god, why?

Look at my avatar.
And I would also like to see how much of a difference it would make if you used an AMD chip.
 
This is nice but what I really want to see are two things:

1. Overclocking the Shader Core as far as it will go.
2. Playing Crysis - the only game I can't max out.

Honestly, these cards are worthless for today's games, because I doubt that any game in the next year will be as difficult to run as Crysis, and that is the only reason to buy them. My system can run any other game maxed out. And since I've already played Crysis... twice, these cards are of no value to me for a long time. I'll be getting the ones that come after them... that is, if we are still able to discern which ones those will be, as nVidia keeps altering the names ridiculously.
 
Tri-SLI is great and all, but that's 2 grand later... That GPU score is insane, as it should be :p
 
When I read the thread, it mentioned that quad didn't work as well as tri. Hopefully NVIDIA will let the GT versions do tri-SLI; that would be great and a bit more cost-effective.
 
Quite impressive.

A little overkill, though, as most games can't use their full potential unless you are running them at a ridiculous res. :laugh:
 
More grains of salt. I'd rather see people here testing the cards, even Wizzard, so we know they aren't touching any internal tweak settings.
 
This is nice but what I really want to see are two things:

1. Overclocking the Shader Core as far as it will go.
2. Playing Crysis - the only game I can't max out.

Honestly, these cards are worthless for today's games, because I doubt that any game in the next year will be as difficult to run as Crysis, and that is the only reason to buy them. My system can run any other game maxed out. And since I've already played Crysis... twice, these cards are of no value to me for a long time. I'll be getting the ones that come after them... that is, if we are still able to discern which ones those will be, as nVidia keeps altering the names ridiculously.

Why should anyone require CrossFire/SLI to play games? Seriously, I'm pretty sure these driver modders can get more out of the cards than having to buy another card. BTW, Crysis doesn't play well with any multi-card setup; even at 1920x1200 it chokes.
Another point: I'd rather be playing at 60 FPS on 1280x1024 than 30 FPS on 2***x16**. 30 FPS is barely playable, because you've got to account for dips in performance every so often.
 
Nice scores, now let's wait and see benchmarks in real games to see how well it performs...
 
impressive!
most impressive!
 
If two GX2 cards can get P23286, isn't 21,000+ unimpressive from tri-SLI of three NEW-gen cards? It should be at least 50% better if they want me to cough up 500 euros for one card. No way, I'm going with ATI cards this time. I'm getting used to the lower-priced cards like the GT, GTS, and 3870/4870. NVIDIA will have a hard time selling me those super expensive cards in the future, at least until there are more PC-exclusive games like Crysis. Most games on PC are ports of console games; those games are very well optimized and run sweet on PCs with semi-decent graphics cards, so a 4870 will do fine for me. And it will run cooler and draw less power as well!
 
There's something wrong with this score, IMHO!

Look in this thread and notice the BIG jump when the 2nd GPU was enabled.

If this card (not SLIed) doesn't score at least 10K in Vantage, it will be a disappointment, and three of these ought to scale terribly well in Vantage, unlike 3D06, so I would expect it to score in the 23K-24K neighborhood, at least.
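The arithmetic behind that expectation can be sketched as follows; the 10K single-card baseline and the 23K-24K target are the poster's assumptions, not measured values:

```python
# A minimal sketch of the scaling argument above. Both the single-card
# baseline and the expected range are assumptions from the post, not data.
single_gpu = 10_000           # assumed minimum single-card Vantage score
observed_tri = 21_350         # tri-SLI result shown in the screenshot
expected = (23_000, 24_000)   # where the poster expects tri-SLI to land

scaling = observed_tri / single_gpu
low, high = (s / single_gpu for s in expected)
print(f"observed tri-SLI scaling: {scaling:.3f}x of one card")
print(f"expected scaling range: {low:.1f}x to {high:.1f}x")
```

In other words, the observed result implies roughly 2.1x scaling over the assumed single card, while the poster's 23K-24K expectation corresponds to 2.3x-2.4x.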
 
Why should anyone require CrossFire/SLI to play games? Seriously, I'm pretty sure these driver modders can get more out of the cards than having to buy another card. BTW, Crysis doesn't play well with any multi-card setup; even at 1920x1200 it chokes.
Another point: I'd rather be playing at 60 FPS on 1280x1024 than 30 FPS on 2***x16**. 30 FPS is barely playable, because you've got to account for dips in performance every so often.

Yes, I would agree with you were it another game, but one thing I've always noticed about Crysis is that even at the 20 FPS average I play it at, it seemed solid, which was weird compared to CSS, where if I get 20 FPS it's really, really bad. I never noticed the lag in Crysis until I got down to 15 or so; the lowest I ever hit was, I think, 12 FPS. So put that card in my rig, and even at 30 FPS I doubt I'd see the 15 FPS where I notice lag. I don't know what Crytek did, but maybe it's the coding that helps with the lag, because I've never played a game where I would consider 20 FPS stable or acceptable.

And it's not just me, check this site:
http://forums.pureoverclock.com/showthread.php?p=14934#post14934
 
The first thing that came to my mind besides the score was GPU-Z in that picture! I must say, congrats W1zz, GPU-Z has already become a standard tool for many overclockers, etc., and it continues to gain in popularity!
 
That's nice, but like everyone else is saying, I want real-world benches, i.e. public benches on multiple systems.
 
Now where's the oct xfire that was promised to us by ATI?
 
I wonder how much that OC had to do with the final score, though

QX9650 OCed to 4GHz, DDR3 OCed at 1GHz . . .
 
I wonder how much that OC had to do with the final score, though

QX9650 OCed to 4GHz, DDR3 OCed at 1GHz . . .

The CPU score was lower than the GPU score, so if anything it held back the final Vantage result.
 