Tuesday, January 22nd 2008

ATI Radeon HD 3870 X2 (R680) 1GB First Full Review Posted

The guys over at PConline.com.cn have again managed to be the first to post a full review of a graphics card that should be available later next month: the ATI Radeon HD 3870 X2 1GB (R680). The first thing you'll notice is that the card beats NVIDIA's GeForce 8800 Ultra in every Futuremark benchmark and in almost every game by quite a margin. Have a good time reading the full story here.
Source: PConline.com.cn

144 Comments on ATI Radeon HD 3870 X2 (R680) 1GB First Full Review Posted

#126
btarunr
Editor & Senior Moderator
What's interesting is the variation between the results of the PConline.com.cn review and Tom's review. Even more interesting are the UT III scores, considering it's another of those titles rumored to 'favour' GeForce.
Posted on Reply
#127
Scrizz
Man, I never got to read Tom's review.
Posted on Reply
#128
btarunr
Editor & Senior Moderator
I was wondering if any reviewer benched an HD3870 X2 against two HD3870 cards in CrossFire.
Posted on Reply
#129
btarunr
Editor & Senior Moderator
Scrizz: Man, I never got to read Tom's review.
Yup, the webmaster is moving the links, so we have to wait till he fixes them. The link and the pics worked a few hours ago, and Mussels read the review.

Try going here: www.tomshardware.com/graphics/index.html, click on the article "ATI Radeon HD 3870 X2 - Fastest Yet!", and see if it works. If you get a 404 from Tom's, then they're still fixing it.
Posted on Reply
#130
Scrizz
Nope, it's not up yet.

Oh, and on a side note, I can't activate my account on LoG.
Posted on Reply
#131
tkpenalty
It's faster than an Ultra, costs as much as a GTX (which seems to be EOL to me), and runs cooler (as well as taking two slots instead of four). Nice card :).

Guys, you have to remember that ATi has the performance crown now.

Yes, power usage is 30W more, but it's not like you will run into any cooling issues... RV670s run cool. It would be better if some manufacturer made one of these with two VF700ALCUs or even VF900CUs... I mean, you can install them yourself as there is enough space :p. Let's just hope that the non-reference-cooling R680s will come with the stiffening bar that the reference card has. It's a bit long though... GTX length.
Posted on Reply
#132
tkpenalty
Mussels: You COULD run the games/programs windowed... but that's crap. Quite often, moving the mouse out of the window in an RTS results in the game minimising or otherwise going stupid.

It's not just a problem with older games, as many MODERN games don't properly support widescreen.

Widescreen comes in two flavours: vert- or hor+.
Vert- gives you the same horizontal view as a 4:3 image, but cuts the top off - you get LESS image than a 4:3 user would.

Hor+ gives you the same vertical view, with more on the sides (real widescreen).

If I have a game with vert- (such as BioShock on first release), I would rather play at 1280x1024 than lose part of the graphics.

Also... even with a GTX and a quad, there's the odd game that I can't max out (Crysis, cough). Why the hell should I run in a window if I can't run at max res? How do people with 1080p screens handle this kind of thing on ATI??
You are overreacting (sorry to answer an old question), but blame the problem on your monitor, because the 22-inch monitors I've seen automatically add the black bars. (Haven't touched Samsung; you might as well blame them instead.)
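To make the hor+ vs. vert- distinction above concrete, here's a rough sketch of how the rendered field of view changes with aspect ratio. It's purely illustrative; the 90° 4:3 baseline and the function names are assumptions, not taken from any particular game engine.

```python
import math

def hfov_from_vfov(vfov_deg, aspect):
    """Horizontal FOV matching a given vertical FOV at some aspect ratio."""
    return math.degrees(2 * math.atan(math.tan(math.radians(vfov_deg) / 2) * aspect))

def vfov_from_hfov(hfov_deg, aspect):
    """Vertical FOV matching a given horizontal FOV at some aspect ratio."""
    return math.degrees(2 * math.atan(math.tan(math.radians(hfov_deg) / 2) / aspect))

# Assumed 4:3 baseline: 90 degrees horizontal (a common default, chosen for illustration).
base_hfov = 90.0
base_vfov = vfov_from_hfov(base_hfov, 4 / 3)   # ~73.7 degrees

wide = 16 / 10                                 # typical 22" widescreen panel

# hor+: keep the 4:3 vertical FOV, so the horizontal view grows (~100.4 degrees).
print("hor+  horizontal FOV:", round(hfov_from_vfov(base_vfov, wide), 1))

# vert-: keep the 4:3 horizontal FOV, so the vertical view shrinks (~64.0 degrees).
print("vert- vertical FOV:  ", round(vfov_from_hfov(base_hfov, wide), 1))
```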
Posted on Reply
#133
Nyte
FYI, just so that reviewers don't trounce AMD and make its stock plummet...

All the reviews so far (3 as of writing this post) seem not to have taken notice of the fact that this is NOT a PCIe 2.0 card. Let me explain...

The HD3870 ASICs themselves are PCIe 2.0 compliant. However, did they ever wonder how these two GPUs talk to each other? Well, there's a switch chip situated between the two GPUs (you can see it in the review photos as well).

This switch is a PCIe 1.1 part. This means that all data transfers in and out of this switch (i.e., between the GPUs and the motherboard) are at 1.1 speeds and not at 2.0 speeds.

This isn't a bad thing. We will not need 2.0 bandwidth for quite some time, considering that even AGP 8x is still holding its own. But this should be known NOW, so no customer purchases it, goes all CRAZY saying it's not 2.0, and starts a smear campaign all over the net, thus pulling AMD's stock price down (NVIDIA fanboys included in this smear campaign rally).

The reviewers are most likely not checking the PCI configuration space of the devices (i.e., the switch) and are instead relying on Catalyst Control Center (which reports the capabilities of the HD3870).


My 2 cents.
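For anyone who wants to check the link for themselves on Linux instead of trusting Catalyst Control Center, the kernel exposes the negotiated and maximum PCIe link speed of each device (including bridge/switch ports) through sysfs. A minimal sketch, assuming the current_link_speed/max_link_speed attributes are available on your kernel; this is not the method the reviewers used. 2.5 GT/s means PCIe 1.x, 5 GT/s means PCIe 2.0.

```python
import glob
import os

# List every PCI device the kernel knows about and print its PCIe link speeds.
for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    speeds = {}
    for attr in ("current_link_speed", "max_link_speed"):
        path = os.path.join(dev, attr)
        try:
            with open(path) as f:
                speeds[attr] = f.read().strip()
        except OSError:
            pass  # attribute missing or link not active for this device
    if speeds:
        print(os.path.basename(dev), speeds)
```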
Posted on Reply
#134
erocker
*
Nyte: FYI, just so that reviewers don't trounce AMD and make its stock plummet...

All the reviews so far (3 as of writing this post) seem not to have taken notice of the fact that this is NOT a PCIe 2.0 card. Let me explain...

The HD3870 ASICs themselves are PCIe 2.0 compliant. However, did they ever wonder how these two GPUs talk to each other? Well, there's a switch chip situated between the two GPUs (you can see it in the review photos as well).

This switch is a PCIe 1.1 part. This means that all data transfers in and out of this switch (i.e., between the GPUs and the motherboard) are at 1.1 speeds and not at 2.0 speeds.

This isn't a bad thing. We will not need 2.0 bandwidth for quite some time, considering that even AGP 8x is still holding its own. But this should be known NOW, so no customer purchases it, goes all CRAZY saying it's not 2.0, and starts a smear campaign all over the net, thus pulling AMD's stock price down (NVIDIA fanboys included in this smear campaign rally).

The reviewers are most likely not checking the PCI configuration space of the devices (i.e., the switch) and are instead relying on Catalyst Control Center (which reports the capabilities of the HD3870).

My 2 cents.
Thing is, it doesn't require a PCI-E 2.0 bridge between the two chips; a PCI-E 1.1 bridge is plenty sufficient. But when you need to send info from both GPUs to the motherboard, PCI-E 2.0 is needed.
Posted on Reply
#135
Nyte
erocker: Thing is, it doesn't require a PCI-E 2.0 bridge between the two chips; a PCI-E 1.1 bridge is plenty sufficient. But when you need to send info from both GPUs to the motherboard, PCI-E 2.0 is needed.
It's not needed because the slave GPU will never need to talk to the system. The slave GPU is only there to update the framebuffer for its own portion (by portion, I mean every other frame or half a frame). The slave will never read from the system and it will never write to the system. The master GPU coordinates all of that.
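For what it's worth, the 'every other frame' scheme described here is alternate frame rendering (AFR). A toy sketch of the work split (the names and structure are made up for illustration; this is not AMD's actual driver logic):

```python
# Toy AFR schedule: the master GPU renders even frames, the slave renders odd
# frames, and only the master ever presents the finished image to the system.
GPUS = ["master", "slave"]

def afr_schedule(frame_count):
    return {frame: GPUS[frame % len(GPUS)] for frame in range(frame_count)}

for frame, gpu in afr_schedule(6).items():
    print(f"frame {frame}: rendered on {gpu}, presented by master")
```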
Posted on Reply
#136
erocker
*
I'm kinda losing you... Are you saying the PCI-E 2.0 isn't needed, or the PCI-E 1.1?
Posted on Reply
#137
Nyte
erocker: I'm kinda losing you... Are you saying the PCI-E 2.0 isn't needed, or the PCI-E 1.1?
What I'm saying is that there is no PCIe 2.0 on the board. The switch chip that arbitrates communication between the two HD3870s is 1.1.

Think of the switch chip as a central connection between the two GPUs and the motherboard (it has three connection ports). The data transfers are therefore limited to 1.1 speeds (which is definitely sufficient).

The reviewers all report that the board operates at 2.0 speeds, which is not true. That's all.
Posted on Reply
#138
erocker
*
Makes sense. :) Are there any benchmarks out there using a 1.1 vs. a 2.0 motherboard?
Posted on Reply
#139
Nyte
erocker: Makes sense. :) Are there any benchmarks out there using a 1.1 vs. a 2.0 motherboard?
There definitely are. The X38s, RD790s, and 780i's (the 780i is not true 2.0; it also uses a 1.1 switch) are all 2.0 compliant. You obviously need a 2.0 graphics card as well, but the performance difference is minimal (if any exists at all).

AGP 8x is still comparable to 1.1, and 2.0 is double 1.1...

What's 2 x minimal equal to?
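To put rough numbers on that, here are the standard theoretical per-direction figures for an x16 slot (textbook rates, not measurements from this card):

```python
# Theoretical per-direction bandwidth in MB/s.
# PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoding -> 250 MB/s per lane.
# PCIe 2.0: 5.0 GT/s per lane, 8b/10b encoding -> 500 MB/s per lane.
pcie_11_x16 = 250 * 16     # 4000 MB/s
pcie_20_x16 = 500 * 16     # 8000 MB/s
agp_8x      = 66 * 8 * 4   # ~2112 MB/s (66 MHz clock, 8x data rate, 32-bit bus)

print(f"AGP 8x   : ~{agp_8x} MB/s")
print(f"PCIe 1.1 : {pcie_11_x16} MB/s")
print(f"PCIe 2.0 : {pcie_20_x16} MB/s")
```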
Posted on Reply
#140
erocker
*
Well, I would be more interested in seeing benchmarks from a PCI-E 2.0 motherboard vs., say, a 680i chipset or a 975X.
Posted on Reply
#141
Mussels
Freshwater Moderator
tkpenalty: You are overreacting (sorry to answer an old question), but blame the problem on your monitor, because the 22-inch monitors I've seen automatically add the black bars. (Haven't touched Samsung; you might as well blame them instead.)
Not really overreacting... and I've used a lot of monitors. In my experience, only Dell and Apple (24" and above) add the bars.


Feel free to prove me wrong - show me one. LOTS of people argue this, mostly because of some mistake they made (some people saw it on movies, others didn't realise it's 4:3 games that screw up, and so on).

I've tried 22" models in the $300 to $600 (AU) price range from Asus, Acer, Dell (not UltraSharp), Chi Mei, and CMV, plus one generic Chinese one I can't remember. Most of them are new enough to support HDCP, and one even had HDMI support - none of them had built-in scaling. My TV does, but pretty much only a few expensive models support it, and they're all 24" or larger, or cost over $2K.
Posted on Reply
#142
Edito
tkpenalty: It's faster than an Ultra, costs as much as a GTX (which seems to be EOL to me), and runs cooler (as well as taking two slots instead of four). Nice card :).

Guys, you have to remember that ATi has the performance crown now.

Yes, power usage is 30W more, but it's not like you will run into any cooling issues... RV670s run cool. It would be better if some manufacturer made one of these with two VF700ALCUs or even VF900CUs... I mean, you can install them yourself as there is enough space :p. Let's just hope that the non-reference-cooling R680s will come with the stiffening bar that the reference card has. It's a bit long though... GTX length.
I don't think ATI/AMD has the performance crown, because there is no big difference between the cards, and the NVIDIA cards have better performance according to the benchmarks: forums.techpowerup.com/showthread.php?t=44484. I just think ATI is doing fine, because they are doing a lot just to get closer to the actual NVIDIA performance. I mean, the dual-GPU 3870 X2 is inferior in some benchmarks to a single 8800 Ultra; with those results I can't say ATI has the performance crown. Don't forget, I'm not an ATI hater; I'm just saying what I think is the truth.
Posted on Reply
#143
eidairaman1
The Exiled Airman
WildCat87: I won't argue about the drivers.

A brand new, high-end, dual-GPU video card, even if only $50, getting just 1 frame more than an old, single-GPU card is still weak. Excellent price/performance ratio, but I was truly expecting more as far as pure performance, even with launch drivers.
Boy, you are the one to talk, yet you own 2 ATI products. Also, from a starting price point, the performance-to-dollar ratio is far better than that of the 8800 Ultra. I mean, seriously, paying 800 bucks for a gaming card is dumb; I'd understand paying that much for a Quadro or FireGL card.
Also, you gotta realize that software hasn't yet caught up with dual-chip GPUs the way it has with multicore CPUs. I recall single-core CPUs outperforming the dual-core CPUs back in the day; well, it took time for software to mature.
Posted on Reply
#144
Mussels
Freshwater Moderator
I believe Wildcat's point is that he was expecting more of a performance leap.

Wildcat: FYI, VERY few launches make great performance leaps. The Core 2 Duo line and the 8800 series were the first 'huge leaps' in a long time. Generally, performance increases gradually between generations over a period of 2-3 years, not with this incredible doubling of performance.

These performance leaps are rare; we just got lucky having two (CPU and GPU) happen at the same time.
Posted on Reply