Monday, June 30th 2008

Two R700s Churn Out X12515 in 3DMark Vantage

AMD's Austin team has managed a score of X12515 in the 3DMark Vantage benchmark using two Radeon HD4870 X2 cards in CrossFireX, a feat that takes three GeForce GTX 280 cards in 3-way SLI to match. The R700 boards were clocked at 778 MHz core, while the GDDR5 memory ran at 980 MHz QDR (effectively 3.92 GHz). This brings the total on-board video bandwidth to a stellar 250.8 GB/s.
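As a sanity check on the quoted bandwidth figure, here is the arithmetic as a short sketch. It assumes the RV770's 256-bit memory interface per GPU, which the article does not state explicitly:

```python
# Effective GDDR5 data rate: 980 MHz base clock, quad data rate (QDR)
base_clock_mhz = 980
effective_gbps_per_pin = base_clock_mhz * 4 / 1000  # 3.92 Gbps per pin

# Assumed: 256-bit (32-byte) memory interface per RV770 GPU
bus_bytes = 256 // 8

per_gpu_gbs = effective_gbps_per_pin * bus_bytes  # 125.44 GB/s per GPU
per_board_gbs = per_gpu_gbs * 2                   # two GPUs per X2 board

print(f"{per_board_gbs:.2f} GB/s")  # ~250.88 GB/s, matching the quoted 250.8
```

The article's 250.8 GB/s figure appears to be the truncated sum for the two GPUs on one X2 board, not for the whole two-card CrossFireX setup.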

With inputs from TG Daily
Add your own comment

157 Comments on Two R700s Churn Out X12515 in 3DMark Vantage

#1
btarunr
Editor & Senior Moderator
Probing for, and adding details...
Posted on Reply
#2
tkpenalty
Holy shit. There is still MUCH better value for performance in this case... two HD4870 X2s are cheaper than three GTX280s by a fair bit.

AMD really is on a roll. Considering Nvidia chipsets don't support CF, well... Nvidia is going to go flying downhill.
Posted on Reply
#3
cool_recep
They are coming dude, seriously...
Posted on Reply
#5
btarunr
Editor & Senior Moderator
Ahem, the score in X<number>, please?
Posted on Reply
#6
DaMulta
My stars went supernova
by: btarunr
Ahem, the score in X<number>, please?
p19337
Posted on Reply
#7
btarunr
Editor & Senior Moderator
by: DaMulta
p19337
I mean Xtreme settings (example X123456)
Posted on Reply
#8
PrudentPrincess
by: btarunr
AMD's Austin team has managed a score of X12515 in the 3DMark Vantage benchmark using two Radeon HD4870 X2 cards in CrossFireX, a feat that takes three GeForce GTX 280 cards in 3-way SLI to match. The R700 boards were clocked at 778 MHz core, while the GDDR5 memory ran at 980 MHz QDR (effectively 3.92 GHz). This brings the total on-board video bandwidth to a stellar 250.8 GB/s.

With inputs from TG Daily
Yeah, impressive that they could beat three GTX280s, which are still single-GPU cards running in the less efficient tri-SLI. AMD is really moving its way up the ladder. :laugh:
Posted on Reply
#9
DaMulta
My stars went supernova
by: btarunr
I mean Xtreme settings (example X123456)
Don't know that one yet.


Ah, I see now, lol. New to Vantage.
Posted on Reply
#10
btarunr
Editor & Senior Moderator
by: PrudentPrincess
Yeah, impressive that they could beat three GTX280s, which are still single-GPU cards running in the less efficient tri-SLI. AMD is really moving its way up the ladder. :laugh:
Look at it this way: 2x $500 equal 3x $650. ;)
Posted on Reply
#11
PrudentPrincess
by: btarunr
Look at it this way: 2x $500 equal 3x $650. ;)
Nvidia hasn't come out with a dual processor card for this generation, so benchmarks like this mean jack shit.
Posted on Reply
#12
a111087
Don't mind if I do: GO AMD!!! :D
Posted on Reply
#13
btarunr
Editor & Senior Moderator
by: PrudentPrincess
Nvidia hasn't come out with a dual processor card for this generation, so benchmarks like this mean jack shit.
Who cares what the card is made of, or how many GPUs it has, as long as it performs on par with an expensive monolithic GPU and remains cheap?
Posted on Reply
#14
DaMulta
My stars went supernova
I was wrong, my X score is X6282

GPU 6052
CPU 40461

Two R700s together are twice as fast.
Posted on Reply
#15
btarunr
Editor & Senior Moderator
by: DaMulta
I was wrong, my X score is X6282

GPU 6052
CPU 40461

Two R700s together are twice as fast.
If Wile E still has that Maxie board, he'd be grinning now. :D
Posted on Reply
#16
Morgoth
I love it ^^ Can't wait to own that beast.
Posted on Reply
#17
substance90
I am so glad that AMD is finally coming back to the scene! The 4870 X2 kicks ass very hard! If it's under $450, that will be my next card.
Posted on Reply
#18
DaMulta
My stars went supernova
by: btarunr
If Wile E still has that Maxie board, he'd be grinning now. :D
True, but for now the 9800GTX cards still fly with what's on the market in Vantage.
Posted on Reply
#19
mnm222876
So how fast will the nVidia 280 x2 card be?

Don't you guys think AMD will get spanked again as soon as the x2 version of the 280 comes out?
Posted on Reply
#20
Morgoth
They should do this benchmark on X58 :P
Posted on Reply
#21
btarunr
Editor & Senior Moderator
by: mnm222876
So how fast will the nVidia 280 x2 card be?

Don't you guys think AMD will get spanked again as soon as the x2 version of the 280 comes out?
Until that happens, they hold the crown. With NVIDIA over-reacting to the RV770 (by slashing prices steeply) and the UMAP, the message that goes out is that it will be a while before NV can counter the RV770 / R700 threat with "something new".
Posted on Reply
#22
tkpenalty
by: PrudentPrincess
Nvidia hasn't come out with a dual processor card for this generation, so benchmarks like this mean jack shit.
Sorry, but Nvidia doesn't always have to make dual cards. You're talking about a card that would use 400W in total; I don't think anyone would want that in their system. Is there a moral objection to dual GPU vs. single GPU? GPUs aren't like CPUs, mate; on paper Nvidia's GTX280 should be demolishing the HD4870, badly.
Posted on Reply
#23
tkpenalty
by: mnm222876
So how fast will the nVidia 280 x2 card be?

Don't you guys think AMD will get spanked again as soon as the x2 version of the 280 comes out?
Remember, one GTX280 core uses 200W of power. The GTX280 CANNOT take temperatures much higher than what it already runs at, due to its sheer size (technically it's a fragile card compared to the RV770, thanks to the massive package). Two GTX280s in close proximity, jammed into one slot, is suicide.

If Nvidia decides to phase out the 9800/9600s, that will be their doom.
Posted on Reply
#24
vojc
by: PrudentPrincess
Nvidia hasn't come out with a dual processor card for this generation, so benchmarks like this mean jack shit.
...and they can't, because the TDP of one 280GTX is on the level of two 4870s ;) The 4870 X2 has a 250W TDP; a 280GTX GX2 would have a 450W TDP, capisce?
Two ATI chips equal the size of one Nvidia chip.
Posted on Reply
#25
mandelore
That's fantastic, way to go ATI...

Again, just a benchmark, but one hell of a benchmark!

The money saved by getting two 4870 X2s instead of three 280s is stellar; you could buy a killer mainboard with that, and probably some decent RAM, lol...
Posted on Reply
Add your own comment