Thursday, June 12th 2008

3-way SLI Action with NVIDIA GeForce GTX 280 and 3DMark Vantage

A little joy for today, one week before the official announcement of the NVIDIA GeForce GTX 280 cards. Here's a little sneak peek at what to expect from three NVIDIA GeForce GTX 280 cards in tri-SLI configuration, an Intel QX9650 processor overclocked to 4 GHz, and the 3DMark Vantage Vista DX10 benchmark. Clock speeds of all three cards can be seen in the photo. The end result is 21350 marks.

Source: VR-Zone
Add your own comment

37 Comments on 3-way SLI Action with NVIDIA GeForce GTX 280 and 3DMark Vantage

#1
MKmods
Case Mod Guru
That would be so perfect in my new SFF case:rockout:

(thanks again malware for all the good posts)
Posted on Reply
#2
Castiel
HOLY HELL:eek:

But I would like to see it tested with an AMD proc.
Posted on Reply
#3
Gallatin

by: EMok1d08
HOLY HELL:eek:

But I would like to see it tested with an AMD proc.
I have to ask... why? For the love of god, why?
Posted on Edit | Reply
#4
MKmods
Case Mod Guru
by: EMok1d08
HOLY HELL:eek:

But I would like to see it tested with an AMD proc.
It would be nice to know if/how much of a difference it really makes.
Posted on Reply
#5
razaron
@Gallatin: lol
This supposed GTX 280 benchmark is twice as good as the supposed 4850 CF benchmark.
Posted on Reply
#6
a_ump
That's not surprising, though those scores sound damn good, since Vantage is the newest and is supposed to be the benchmark for DX10, and they can already dominate it. Though, as always, I'm sure Vantage is no better an indication of how performance will be in games than 06 was.
Posted on Reply
#7
Castiel
by: Gallatin
I have to ask... why? For the love of god, why?
Look at my avatar.
And I would also like to see how much of a difference it would make if you used an AMD chip.
Posted on Reply
#8
Weer
This is nice but what I really want to see are two things:

1. Overclocking the Shader Core as far as it will go.
2. Playing Crysis - the only game I can't max out.

Honestly, these cards are worthless for today's games, because I doubt that any game in the next year will be as difficult to run as Crysis, and so that is the only reason to buy them. My system can run any other game maxed out. And since I've already played Crysis... twice, these cards are of no value to me for a long time. I'll be getting the ones that come after them... that is, if we are still able to discern which ones those will be, as nVidia keeps altering the names ridiculously.
Posted on Reply
#9
Megasty
Tri-SLI is great and all, but that's 2 grand later... That GPU score is insane, as it should be :p
Posted on Reply
#10
MKmods
Case Mod Guru
When I read the thread, it mentioned that quad didn't work as well as tri. Hopefully Nvidia will let the GT versions do tri-SLI; that would be great and a bit more cost-effective.
Posted on Reply
#11
Squirrely
Quite impressive.

A little overkill though, as most games can't use their full potential unless you are running at a ridiculous res. :laugh:
Posted on Reply
#12
eidairaman1
More grains of salt. I'd rather see people here testing the cards, even Wizzard, so we know they aren't touching any internal tweak settings.
Posted on Reply
#13
eidairaman1
by: Weer
This is nice but what I really want to see are two things:

1. Overclocking the Shader Core as far as it will go.
2. Playing Crysis - the only game I can't max out.

Honestly, these cards are worthless for today's games, because I doubt that any game in the next year will be as difficult to run as Crysis, and so that is the only reason to buy them. My system can run any other game maxed out. And since I've already played Crysis... twice, these cards are of no value to me for a long time. I'll be getting the ones that come after them... that is, if we are still able to discern which ones those will be, as nVidia keeps altering the names ridiculously.
Why should anyone require CrossFire/SLI to play games? Seriously, I'm pretty sure these driver modders can get more out of the cards than having to buy another card. BTW, Crysis doesn't play well with any multi-card setup; even at 1920x1200 it chokes.
Another point: I'd rather be playing at 60 FPS on 1280x1024 than 30 FPS on 2***x16**. 30 FPS is barely playable, because you have to account for dips in performance every so often.
Posted on Reply
#14
Edito
Nice scores. Now let's wait for real game benchmarks to see how well it performs...
Posted on Reply
#16
jbunch07
impressive!
most impressive!
Posted on Reply
#17
HaZe303
If two GX2 cards can get P23286, 21000+ isn't that impressive from tri-SLI of three NEW-gen cards?? It should be at least 50% better if they want me to cough up 500 euros for one card. No way, I'm going with ATI cards this time. I'm getting used to the lower-priced cards like the GT, GTS, and 3870/4870. Nvidia will have a hard time selling me those super-expensive cards in the future, at least until there are more PC-exclusive games like Crysis. Most games on PC are ports of console games; those games are very well optimized and run sweet on PCs with semi-decent graphics cards, so a 4870 will do fine for me. And it will be cooler and draw less power as well!
Posted on Reply
#18
HTC
There's something wrong with this score, IMHO!

Look in this thread and notice the BIG jump when the 2nd GPU was enabled.

If this card (not SLIed) doesn't score at least 10K in Vantage, it will be a disappointment, and three of these ought to scale very well in Vantage, unlike 3D06, so I would expect a score in the 23K-24K neighborhood, at least.
Posted on Reply
#19
a_ump
by: eidairaman1
Why should anyone require CrossFire/SLI to play games? Seriously, I'm pretty sure these driver modders can get more out of the cards than having to buy another card. BTW, Crysis doesn't play well with any multi-card setup; even at 1920x1200 it chokes.
Another point: I'd rather be playing at 60 FPS on 1280x1024 than 30 FPS on 2***x16**. 30 FPS is barely playable, because you have to account for dips in performance every so often.
Yes, I would agree with you were it another game, but one thing I've always noticed about Crysis is that even at the 20 FPS average I play it at, it seems solid, which is weird compared to CSS, where 20 FPS is really, really bad. I never noticed the lag in Crysis until I got down to 15 or so. The lowest I ever hit was, I think, 12 FPS, so put that card in my rig and even at 30 FPS I doubt I'd see the 15 FPS where I notice lag. I don't know what Crytek did, but maybe it's the coding that helps with the lag, because I've never played a game where I would consider 20 FPS stable or acceptable.

And it's not just me; check this site:
http://forums.pureoverclock.com/showthread.php?p=14934#post14934
Posted on Reply
#20
panchoman
Sold my stars!
The first thing that came to my mind besides the score was GPU-Z in that picture! I must say, congrats W1zz, GPU-Z has already become a standard tool for many overclockers, etc., and it continues to gain in popularity!
Posted on Reply
#21
freaksavior
To infinity ... and beyond!
That's nice, but like everyone else is saying, I want real-world benches, i.e. public benches with multiple systems.
Posted on Reply
#22
panchoman
Sold my stars!
Now where's the oct CrossFire that was promised to us by ATI?
Posted on Reply
#23
imperialreign
I wonder how much that OC had to do with the final score, though

QX9650 OCed to 4GHz, DDR3 OCed at 1GHz . . .
Posted on Reply
#24
HTC
by: imperialreign
I wonder how much that OC had to do with the final score, though

QX9650 OCed to 4GHz, DDR3 OCed at 1GHz . . .
The CPU score was lower than the GPU score, so, if anything, it held back the Vantage final result.
Posted on Reply
#25
DarkMatter
by: a_ump
Yes, I would agree with you were it another game, but one thing I've always noticed about Crysis is that even at the 20 FPS average I play it at, it seems solid, which is weird compared to CSS, where 20 FPS is really, really bad. I never noticed the lag in Crysis until I got down to 15 or so. The lowest I ever hit was, I think, 12 FPS, so put that card in my rig and even at 30 FPS I doubt I'd see the 15 FPS where I notice lag. I don't know what Crytek did, but maybe it's the coding that helps with the lag, because I've never played a game where I would consider 20 FPS stable or acceptable.

And it's not just me; check this site:
http://forums.pureoverclock.com/showthread.php?p=14934#post14934
It's because of many things, but the most important is that everything in Crysis has a separate thread on the CPU and in memory: renderer, physics, AI, controls. And the engine will try to balance them according to the situation. That means that, unlike every other game (that I can think of, at least), in Crysis "lag" in any of those systems doesn't affect the others. Usually graphics, physics, or AI lag translates into mouse or net lag (extremely true for CSS), but this doesn't happen in Crysis, or is mitigated a lot, which means almost the same in practice.

EDIT: That system is supposedly smart enough to change physics and AI "resolution" or "framerate" on the fly if it sees too much workload on the CPU and thinks it's not going to be able to handle it. What I mean by framerate and resolution there is that, instead of calculating physics and AI for every frame, it will do it at a lower rate than the renderer (framerate). Or it can calculate fewer interactions per frame (resolution). Anything to find the balance it needs.
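The decoupled update rates described here are commonly implemented with a fixed-timestep accumulator loop. Below is a minimal sketch of that general pattern, not CryEngine's actual code; all names (`PHYSICS_DT`, `run_frames`) are hypothetical:

```python
# Sketch of a game loop where physics/AI tick at a fixed, lower rate
# than the renderer, so render-rate swings don't change simulation rate.
# Illustrative only; not taken from any real engine.

PHYSICS_DT = 1.0 / 30.0  # physics/AI "framerate": 30 fixed steps per second

def run_frames(frame_times):
    """Simulate a sequence of rendered frames.

    frame_times: per-frame durations in seconds.
    Returns (frames_rendered, physics_steps_taken).
    """
    accumulator = 0.0
    physics_steps = 0
    frames = 0
    for dt in frame_times:
        accumulator += dt
        # Physics/AI catch up in fixed-size steps, independent of render rate.
        while accumulator >= PHYSICS_DT:
            physics_steps += 1       # one fixed-timestep physics/AI update
            accumulator -= PHYSICS_DT
        frames += 1                  # render once per loop iteration
    return frames, physics_steps

# Sixty ~16.7 ms frames and ten 100 ms frames both cover one second,
# so both yield roughly 30 physics steps despite very different FPS.
smooth = run_frames([1.0 / 60.0] * 60)
choppy = run_frames([0.1] * 10)
```

Because the simulation step count tracks elapsed time rather than render rate, a slow renderer doesn't slow down physics/AI (and vice versa), which is roughly the decoupling effect described above.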
Posted on Reply
Add your own comment