Thursday, July 10th 2008

R700 up to 80% Faster than GeForce GTX 280

Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will on average be 50% faster than the GeForce GTX 280, and in some tests 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770 Pro), will also convincingly outperform the GeForce GTX 280. The R700 series will be brought to market late July through August. Source: Hardspell

149 Comments on R700 up to 80% Faster than GeForce GTX 280

#1
Millenia
by: vojc
True, we need better Phenom CPUs, or else I go blue
Honestly you don't need THAT much out of a CPU for gaming nowadays; even a mid-range dual core will do more than enough.

Kinda funny as I would probably have more actual use for a quad as a 3d artist than most of TPU yet I don't have one :laugh:
#2
DarkMatter
by: Millenia
Kinda funny as I would probably have more actual use for a quad as a 3d artist than most of TPU yet I don't have one :laugh:
+1. I'm in the same circumstances. LOL. But we don't work at home anyway, do we?

I don't, and I want to... :cry:
#3
JC316
Knows what makes you tick
Now that is ownage. I just want AMD to come out with a chip that can do that to Intel. That should keep the company alive for a while longer. Go ATI!
#4
Darkrealms
by: Darkrealms
by: PVTCaboose1337
ATI wins. I am quite surprised, as that is a HUGE increase.
I agree with that statement if this is true. That would make it the fastest current card. But if Nvidia managed a 280 x2, I think it would lose to that.
by: robodude666
How many nVidia supporters do you think will say it doesn't count because it's not 1 GPU?
Ouch, see my above statement . . .
#5
Selene
I'm glad this will bring the 260/280 prices down even more; also, with the 55nm 260/280s coming, look out ATI.
I'm sorry, but card for card NV owns. Yeah, the prices were out of whack, but that's being fixed. Just goes to show being the first to have new stuff does indeed come at a price!
#6
[I.R.A]_FBi
by: candle_86
yeah, but their stock cooler is crap, so that feature is worthless and it's just a way for most people to damage something. I don't volt my video cards; I OC at stock volts, and I care who has the headroom at stock voltage with stock cooling, and IMO ATI doesn't have either
Why do you have to be so openly ignorant?
#7
X-TeNDeR
by: Kei

No offense...but do you own a Phenom...or a 4xxx series card? I see your sig mentions you have a Spider, but your specs don't show anything that is Spider at all? O.o

K
None taken. I had a 9500 B2 Phenom for a while, then had to sell it for a better rig overall, then got stuck with a budget, so I got the cheapest rig for now, with a Phenom 9850BE on the way. My current rig is satisfying my needs for now, so no rush. :)
#8
sam0t
Just a quick check on ATI vs Nvidia prices from small Finland:

4850: 149e > 230$
4870: 230e > 354$

GTX 260: 286e > 440$
GTX 280: 399e > 614$
#9
vojc
Yep, prices in Europe are not buyer friendly :)
#10
NympH
by: [I.R.A]_FBi
owned ... can you say owned?
OWNED! :rockout:
#11

I think the 4870 X2 could be equal to a GTX 280 x2, since they made that chip to provide better efficiency than other dual-GPU solutions.
#12
SK-1
First time ATI has been on top in a LONG while,...x1900xtx comes to mind.
Good for ATI!
#13
chron
by: SK-1
First time ATI has been on top in a LONG while,...x1900xtx comes to mind.
Good for ATI!
Yeah, the x1900xtx was on top for a while, but then the 7900gtx came out and did better for like 100 bucks less. Why does that seem like a lifetime ago? :P
#14
KieranD
GTX 280 x2 prolly will be a little faster, by 10 or 15%, but efficiency might go to ATI.

If it's 80% faster than 1 GTX 280, the dual version of the 280 is bound to be faster.
#15

by: KieranD
GTX 280 x2 prolly will be a little faster, by 10 or 15%, but efficiency might go to ATI.

If it's 80% faster than 1 GTX 280, the dual version of the 280 is bound to be faster.
Two 280s may not be as efficient.
#16
TooFast
by: chron
yeah the x1900xtx was on top for a while, but then the 7900gtx came out and did better for like 100 bucks less. Why does that seem like a lifetime ago? :P
Don't forget the x1950 xtx was the king at that time :D
#17
WarEagleAU
Bird of Prey
Great news and whatnot, but it's taking two cores to top one Nvidia GPU. I'd love to see ATI get a one-core stomper out there.
#18
Megasty
Now that's being pwned to death. NV needs to dump that GT200 completely. If they don't watch it, their next-gen card won't even be as fast as the 4870 X2 & yet cost just as much, if not more, than the GTX 280. I thought the thing would pwn, but this is ridiculous.
#19
VanguardGX
by: WarEagleAU
Great news and whatnot, but it's taking two cores to top one Nvidia GPU. I'd love to see ATI get a one-core stomper out there.
I was wondering how long it would take for someone to mention that :) So what if it has 2 cores? It's the way of the future, just like a dual-core processor.
#20
Megasty
by: VanguardGX
I was wondering how long it would take for someone to mention that :) So what if it has 2 cores? It's the way of the future, just like a dual-core processor.
That really wasn't his point... ah, it doesn't matter :rolleyes:

ATi pretty much has a game plan. They're not focusing as much on NV as NV is on them - for obvious reasons. If ATi continues on this path, they will have a single GPU that eats all comers. The 4870 is more than twice as powerful as a 3870. It's ridiculous to think that their next GPU will be 2x+ as powerful as a 4870, but the possibility is there. ATi also has rumored plans for a dual-core GPU - & if it takes something like that to take the 'single' GPU crown, then so be it. But the last thing ATi will do right now is stray from their game plan (architecture) when it's finally starting to bear sweet fruit.
#21
swaaye
How about we compare R700 (2xRV770) to GTX 280 SLI? :) Price is a bit different though I guess lol.

BTW, some of you really need to understand GPU architecture a bit better. R700 is not more efficient than a design based on a single giant GPU would be - at least, not for pure 3D performance. The fastest GPU design, due to how parallel 3D rendering is, is always a single GPU. Dual GPUs waste RAM and have to deal with inefficiencies caused by trying to split the tasks and communicate via a pathetically slow bridge chip. There is extra hardware, and hardware performing redundant tasks. And the drivers have to be specially set up for basically every game (this is conveniently ignored by most people for some reason).

The problem is that manufacturing technology cannot cope with mega-huge GPUs. That is why GT200 can't clock as high as G92. The bigger the chip gets, with more transistors, the hotter it is and the more complex it becomes to make it stable at higher clock speeds. If you look back at how early GPUs barely needed fans compared to today's ridiculous furnaces, you see that manufacturing is way behind what competition has pushed GPUs to become.

R700 and 9800GX2 are designed to overcome manufacturing inadequacies in the only way possible. They also conveniently allow an entire lineup to be based mainly on a single GPU design. It's just important to realize that this is not the optimum way to go for performance.

Also, realize that there is potential for a refreshed GT200 to be vastly faster. If they shrink it down and tweak it, and this allows it to clock up decently higher, a dual GT250 (or whatever) could be a lot faster than R700.
#22
candle_86
by: TooFast
dont forget the x1950 xtx was the king in that time:D
nope, the 7950GX2 took the DX9 crown :D
#23
swaaye
by: candle_86
nope, the 7950GX2 took the DX9 crown :D
When it worked - and it didn't always work. The critical flaw of dual-GPU cards is drivers, and whether the game engine works with the method used to split the work across the two GPUs. For example, in EverQuest 2 the 7950GX2 was only as fast as one of its boards (= 7900GT).

Besides, who can really claim a GF7 or X19xx as the crown of anything? Current cards run DX9 just fine and definitely look a hell of a lot better doing it than GF7. GF7's image quality was not all that great.
#24
brian.ca
by: DarkMatter
It's not too much hotter than HD4870. In fact it's cooler, but it does consume a bit more. Not much really under normal usage. It has a higher peak consumption but consumes less on average. Dual GPU cards never have both cards under full load. Also Nvidia just needs a GTX260 GX2 with higher clocks. Does not need GTX280 GX2 to be on top.
I haven't looked too deeply into this or anything but 2 points about what you just said,

1) Wasn't the default fan speed set lower than it needed to be? From what I read there was a simple fix for this, and just turning the speed up was supposed to help a lot with the heat situation without too much cost in noise. If that is the case, early benchmarks measuring heat don't really speak to the thermals of the chip itself.

2) It was my understanding from various posts that the issue with the 4870's idle power draw is that it's not downclocking properly for 2D mode, so it continues at full-strength readiness even when you're just reading emails. I believe this is just a driver issue, and ATI said it will be resolved with a new Catalyst release. So saying that the RV770 on average draws more power seems like a faulty argument. It draws less at max, and it currently draws more at idle, but once that's fixed via a driver I'm not sure there's much reason to think it wouldn't draw less power at idle as well. I'm assuming your average was just (load + idle) / 2; if that's the case, as soon as that's fixed the numbers should change significantly.
#25
purecain
The difference in this multi-GPU situation is that ATI designed their chip to work in a multi-chip config from the very beginning... Nvidia did not....
Come on ATI.....