Monday, June 30th 2008

Two R700s Churn Out X12515 in 3DMark Vantage

AMD's Austin team has managed a score of X12515 in the 3DMark Vantage benchmark using two Radeon HD 4870 X2 cards in CrossFireX, a feat that takes three GeForce GTX 280 cards in 3-way SLI to match. The R700 boards were clocked at 778 MHz on the core, while the GDDR5 memory ran at 980 MHz QDR (effectively 3.92 GHz). This brings the combined on-board video memory bandwidth of each HD 4870 X2 board to a stellar 250.8 GB/s.
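For the curious, that bandwidth figure can be reproduced with a quick back-of-the-envelope calculation (a sketch, assuming the HD 4870's 256-bit memory interface per GPU and decimal gigabytes):

```python
# GDDR5 on the HD 4870: 980 MHz base clock, quad data rate (QDR)
base_clock_ghz = 0.98
effective_ghz = base_clock_ghz * 4        # 3.92 GT/s effective transfer rate

bus_bytes = 256 // 8                      # 256-bit interface = 32 bytes per transfer

per_gpu = effective_ghz * bus_bytes       # bandwidth per GPU, in GB/s
per_board = per_gpu * 2                   # two GPUs per HD 4870 X2 board

print(f"per GPU:   {per_gpu:.2f} GB/s")   # per GPU:   125.44 GB/s
print(f"per board: {per_board:.2f} GB/s") # per board: 250.88 GB/s
```

The quoted 250.8 GB/s appears to come from rounding the per-GPU figure to 125.4 GB/s before doubling.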

With inputs from TG Daily

157 Comments on Two R700s Churn Out X12515 in 3DMark Vantage

#1
Voyager
:toast: great!

Will there be any 4600 series?
#2
zOaib
WarEagleAU said:
Behold the King, the King of Kings! Bow down to the, bow down to the King! (Triple H's Music)

That is very awesome. But remember, the 3 GTX 280s are single GPUs; the X2 setup comes out to 4 GPUs total.
4 GPUs = $1,000

3 GPUs = $1,950

If you get 4 for $950 less, isn't that a bargain? Plus you don't have to own your own power-generating station for the triple SLI...

What I don't understand is people saying it isn't fair to use 2 GPUs to beat a single GPU. Well, I've got one word (actually it's a sentence): if my V8 costs less and beats an overpriced V6 car, which one should I get?
#3
mdm-adph
Voyager said:
:toast: great!

Will there be any 4600 series?
Yeah, what about us midrange people? :laugh:

Seriously, I was actually going to CrossFire two 3650s together just for the hell of it, but only if I can get a 3650 really cheap one day.
#4
DanTheBanjoman
Señor Moderator
vojc said:
....and they can't, because the TDP of the 280GTX is on the level of two 4870s ;) So the 4870 X2 has a 250 W TDP, while a 280GTX GX2 would have a 450 W TDP, kapis?
The size of 2 ATI chips equals the size of one NVIDIA chip.
That, and bta's financial argument, are bad for NV. They can't compete at the power-consumption or price level. However, that 9800GTX+ doesn't look bad, so I guess NV will be fine in the segments where the money is.
As long as NV markets its card as the fastest and responds to the X2 with "but those are two cards," the world will still fall for it. And last time I checked, NV is better at marketing than ATI/AMD.

Also it's "capisce", as it isn't the most friendly choice of words you should at least spell it correctly.
#5
Megasty
WarEagleAU said:
Behold the King, the King of Kings! Bow down to the, bow down to the King! (Triple H's Music)

That is very awesome. But remember, the 3 GTX 280s are single GPUs; the X2 setup comes out to 4 GPUs total.
:toast: :roll:

Which one would I rather buy: a $500 card (AA) that completely beats a $650 card (B) that hardly beats a $300 card (A)? AA >> B > A, yet the prices run $500 >> $650 > $300. It's a paradox that will never make sense, unless NV lowers its prices, or people just keep blowing sick amounts of cash on names instead of researching first. That's why kids can't afford things of this nature...
#6
btarunr
Editor & Senior Moderator
Even in the mainstream segments, NV is the worse deal. Sure, for $230 you get a 9800 GTX+, but for $69 more you get an HD4870 that equals or beats the GTX 260 and the 9800 GX2, according to some reviews.

Something hints that this 2x R700 bench was run on a machine running a Phenom X4.
#7
DanTheBanjoman
Señor Moderator
btarunr said:
Even in the mainstream segments, NV is the worse deal. Sure, for $230 you get a 9800 GTX+, but for $69 more you get an HD4870 that equals or beats the GTX 260 and the 9800 GX2, according to some reviews.

Something hints that this 2x R700 bench was run on a machine running a Phenom X4.
There's always the "for $x more you can have *insert item*" argument.
Besides, $69 is 30% more, not exactly "a bit" more. Considering the HD4870 isn't anywhere near 30% faster than a 9800 GTX, and the + is faster than the normal GTX, I hardly believe NV is that far behind.
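For what it's worth, the 30% figure checks out exactly (a quick sketch using the $230 and $69 numbers quoted above):

```python
gtx_plus_price = 230   # 9800 GTX+ price quoted above, in USD
hd4870_premium = 69    # extra cost of the HD4870 in this comparison

premium_pct = hd4870_premium / gtx_plus_price * 100
print(f"{premium_pct:.0f}% more")  # prints "30% more"
```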
#8
VanguardGX
Why do people use the fact that the X2 is a dual-chip card as if it's a bad thing? So what if it's a dual-chip card? It's gonna be faster than the GTX 280, and not to mention cheaper.
Another thing: people keep saying ATI will lead till the green team makes a 280 GX2. C'mon people, let's be serious, that's not gonna happen, well, not this generation. Do you want a GPU that burns 400+ watts? Didn't think so.
#9
MoA
VanguardGX said:
Why do people use the fact that the X2 is a dual-chip card as if it's a bad thing? So what if it's a dual-chip card? It's gonna be faster than the GTX 280, and not to mention cheaper.
Another thing: people keep saying ATI will lead till the green team makes a 280 GX2. C'mon people, let's be serious, that's not gonna happen, well, not this generation. Do you want a GPU that burns 400+ watts? Didn't think so.
Hahah, pretty obvious answer:
because they need a reason to convince themselves that NVIDIA is better :P
#10
vojc
It's what Intel does with its quad cores, just sticking 2 + 2 cores together, and AMD/ATI does the same in the graphics market.
NVIDIA, on the other hand, doesn't know how to do a single-board dual GPU, so they're only able to stick 2 boards together.
#11
btarunr
Editor & Senior Moderator
vojc said:
It's what Intel does with its quad cores, just sticking 2 + 2 cores together, and AMD/ATI does the same in the graphics market.
NVIDIA, on the other hand, doesn't know how to do a single-board dual GPU, so they're only able to stick 2 boards together.
It's not that they don't know. Never underestimate the engineering prowess of NVIDIA. It's just that the power and thermal characteristics of their GPUs don't allow sticking two of them onto one board.
#12
0o0o0
PrudentPrincess said:
Nvidia hasn't come out with a dual processor card for this generation, so benchmarks like this mean jack shit.
Uhum, do you really think NVIDIA is going to make a GTX 280 X2? Two 500 mm² dies on one card means you'd need an extreme fan to keep them cool; with a standard fan they could reach 100-110°C.
And, uh, one GTX 280 costs $650; a GTX 280 X2 would be $1,200 or so. Who would buy that?

Congratz AMD, nice job :respect:
#13
DaMulta
My stars went supernova

http://www.xtremesystems.org/forums/showthread.php?t=191313
k|ngp|n did that on all air. That's 3 GPUs vs. 4 GPUs. If NVIDIA does come out with a dual card again, I think it could really put a hurting on the X2. (This was before the PhysX drivers, I think.)


Then he turned around and did this. Wow.


0o0o0 said:
Uhum, do you really think NVIDIA is going to make a GTX 280 X2? Two 500 mm² dies on one card means you'd need an extreme fan to keep them cool; with a standard fan they could reach 100-110°C.
And, uh, one GTX 280 costs $650; a GTX 280 X2 would be $1,200 or so. Who would buy that?

Congratz AMD, nice job :respect:
Maybe on the next die shrink.
#14
Morgoth
Now try to beat the HD4870 X2 in CrossFire with a Bloomfield at 4 GHz ;)
#15
btarunr
Editor & Senior Moderator
Wonder what a brutal overclocker such as k|p could do to these cards on an Intel platform. In similar publications by both TG Daily and Tom's Hardware, the slide (in the first post) uses the resource name "3dmarkonnextgenphenom", leading me to guess they ran it on a Bulldozer :confused:

#18
DanTheBanjoman
Señor Moderator
mlupple said:
DaMulta, no offense, but for someone who spends 5 million dollars on graphics cards, you are a moron.
For someone with two posts, one being fanboyism and the other a direct insult towards another member, you deserve a warning. Consider this it.
#19
DaMulta
My stars went supernova
mlupple said:
DaMulta, no offense, but for someone who spends 5 million dollars on graphics cards, you are a moron.
When you compete in 3DMark, score is all that matters, not how much it costs. :)

It would still cost a thousand dollars for 2 X2 cards, which is also a lot of money.

VanguardGX said:
@DaMulta
First thing, the 4870 X2 is still seen as 1 card even though it has two cores. So it's still 2 cards vs. 3.

EDIT: and as I keep saying, a 280 GX2 is not gonna happen anytime soon.

EDIT again: Hope this does not piss off any NV fanboys in the building, but I just thought it was funny :) http://news.softpedia.com/images/news2/NVIDIA-to-Release-GTX-280-GX2-4.jpg
The X2 is going to be a great card for the money, but when you want more than that, the 280 will be the way to go. I know it's considered a single card, but the fact is that it's 3 cores vs. 4.
#20
farlex85
DaMulta said:

The X2 is going to be a great card for the money, but when you want more than that, the 280 will be the way to go. I know it's considered a single card, but the fact is that it's 3 cores vs. 4.
I don't think so, because as far as I know you still can't put 4 single cards in SLI. This means maxed out, ATI becomes the top dog (and costs much less doing so). The only way NVIDIA can reclaim the performance crown now is if and when they're able to make a dual GT200, which may be a while.
#21
Morgoth
btarunr said:
Wonder what a brutal overclocker such as k|p could do to these cards on an Intel platform. In similar publications by both TG Daily and Tom's Hardware, the slide (in the first post) uses the resource name "3dmarkonnextgenphenom", leading me to guess they ran it on a Bulldozer :confused:


lol, if that were true, then it's true that Nehalem is again a superior architecture
#22
DaMulta
My stars went supernova
farlex85 said:
I don't think so, because as far as I know you still can't put 4 single cards in SLI. This means maxed out, ATI becomes the top dog (and costs much less doing so). The only way NVIDIA can reclaim the performance crown now is if and when they're able to make a dual GT200, which may be a while.
I don't know if they can take the top 280 score in Vantage as it is now. Yes, I think it will be a while before, if ever, we see a dual card with the 280. But they do have loads of cash that they could throw at it and have it on the market in record time.
#23
btarunr
Editor & Senior Moderator
Morgoth said:
lol, if that were true, then it's true that Nehalem is again a superior architecture
Go to the TG Daily link in the 1st post. The Phenom 9950 manages an 11,000-and-change Extreme CPU score. DaMulta's YF manages 40,000 in the Performance score.

I bet this would have been a better bench if they had run it on an Intel setup, though I'm just guessing they didn't.
#24
farlex85
btarunr said:
Go to the TG Daily link in the 1st post. The Phenom 9950 manages an 11,000-and-change Extreme CPU score. DaMulta's YF manages 40,000 in the Performance score.

I bet this would have been a better bench if they had run it on an Intel setup, though I'm just guessing they didn't.
That's PhysX causing the 40k CPU score; Intel or AMD doesn't have as much to do with it. Without PhysX he still gets like 18k on the CPU score, though. I get 30k in Vantage CPU with a 6750 with PhysX. :laugh:
#25
vojc
As far as I know, the test was done on a 3.3 GHz 65 nm Phenom. The next-gen Phenom is only 45 nm with 3 or 6 MB of L3, so not a big deal, still slower than an OCed Q6600.
So wait for a bench on a 4 GHz Intel quad proc (it must get ~X15000).
......and for all the nvidiots: that's 4 GPUs vs NVIDIA's 3 GPUs, which is 10 cm² vs 15 cm² for NVIDIA, or put another way, ~500 W TDP vs ~700 W TDP, or better yet, ~$900 vs ~$1,800 :)
I hope all the nvidiots understand what I am saying.