Thursday, July 10th 2008

R700 up to 80% Faster than GeForce GTX 280

Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will on average be 50% faster than the GeForce GTX 280, and in some tests up to 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770 Pro), will also convincingly outperform the GeForce GTX 280. The R700 series will be brought to market late July through August. Source: Hardspell

149 Comments on R700 up to 80% Faster than GeForce GTX 280

#1

by: indybird
If Nvidia takes the GX2 route again with the GTX 280, then (assuming the GX2 performs equal to two GTX 280s in SLI) it will be approximately equal to the HD 4870 X2.

Here's my reasoning:
-In SLI, the second GTX 280 adds anywhere from 50% to 80% of the performance of a single card (according to most reviews).
-The ATI HD 4870 X2 is claimed to be 50% to 80% faster than the GTX 280.
-Therefore, unless Nvidia improves its scaling, or ATI's isn't as good as they are currently claiming, these two cards will be equal in performance.
-Based on the cost of a 65nm GTX 280 core, such a GX2 would be very expensive: ~$700.
Exactly what I said a few posts ago. :)
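indybird's back-of-envelope comparison can be sketched in a few lines of Python; the 50-80% figures below are the hypothetical claims quoted in the thread, not measurements, and the helper name is ours:

```python
# Back-of-envelope check of indybird's scaling argument.
# All percentages are claimed ranges from the thread, not benchmark results.

def dual_gpu_speedup(second_card_scaling):
    """Performance relative to one GTX 280, assuming a second GPU
    adds `second_card_scaling` of a single card's performance."""
    return 1.0 + second_card_scaling

# Claimed SLI contribution of a second GTX 280: 50% to 80%
gtx280_gx2 = [dual_gpu_speedup(s) for s in (0.50, 0.80)]

# Claimed HD 4870 X2 lead over a single GTX 280: 50% to 80% faster
hd4870x2 = [1.0 + s for s in (0.50, 0.80)]

# Both work out to 1.5x-1.8x a single GTX 280, hence "approximately equal".
print(gtx280_gx2)  # [1.5, 1.8]
print(hd4870x2)    # [1.5, 1.8]
```

Under these assumptions the two ranges coincide exactly, which is all the "approximately equal" conclusion rests on.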
#2
InfDamarvel
ATI/AMD has been putting so much time into this dual-GPU solution that they have finally begun to perfect it. It seems that Nvidia has spent too much time on making a single-GPU solution which, while quite powerful, isn't very cost effective. And at the same time they let ATI take over dual-GPU solutions even though they started the entire trend.

What an interesting time in the market lol.
#3
yogurt_21
by: KainXS
I think Nvidia might skip a GX2 this time around. Comparing the scores the R700 is pulling with what SLI GTX 280s do (from what I've seen they scale about 10-45% on average in this review), releasing a GX2 would be murder. It costs a lot to make those G200 cores, and even when the die shrinks they're still going to cost a lot to make; they're going to have to speed up development of the G300s.

And why would any of you want a GTX 280 GX2? That card would cost at least 900 dollars when you see that the 280s cost about 500-550 dollars now. Unless you have money to burn and are a major Nvidia fan, that's a waste of money.

Nvidia totally :nutkick: this time

I want to wait for the R800s before I buy another card; if the R700 can pull off 80%, though, then I will definitely buy that.

http://www.tbreak.com/reviews/article.php?cat=grfx&id=618&pagenumber=10
that reviewer needs a lesson in SLI; I call shenanigans. I mean, seriously, the scores couldn't be more inconsistent if they were made up entirely. lol
#4
wolf
Performance Enthusiast
and just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.
#5
GPUCafe
GPU Cafe Representative
by: wolf
and just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.
Like before? This is exactly like the 7950GX2 versus the X1900: smaller, efficient chips versus a bigger brute-force chip.
#6
wolf
Performance Enthusiast
no, this is exactly like the 3870X2 vs an 8800/9800 single GPU. When you put them in SLI, they no doubt beat ATI's counterpart.

heck, even 9600GT SLI gives a 3870X2 a damn good run for its money; I'd say roughly even. Not to mention, for all the price/performance fanatics out there, as I remember it the 9600GT SLI option was cheaper.
#7
btarunr
Editor & Senior Moderator
by: wolf
no, this is exactly like the 3870X2 vs an 8800/9800 single GPU. When you put them in SLI, they no doubt beat ATI's counterpart.
and cost exponentially more... and void the convenience of running two GPUs on a single slot...
#8
imperialreign
by: SK-1
:roll: Indica or Sativa imperialreign?:roll:
neither . . . (sadly, and thankfully, I can't do that anymore)

it was another off-handed reference to one of the funniest movies of all time . . .

I was only trying to lighten the mood . . . :ohwell:
#9
wolf
Performance Enthusiast
by: btarunr
and cost exponentially more... and void the convenience of running two GPUs on a single slot...
the ratio isn't exponential, please don't exaggerate that much. And like I said, as for price: 9600GT.

I've said it before and I'll say it again: the people out there who want THE BEST performance will throw money at it. I know a lot of people like that. They all choose Nvidia because it makes the beefiest GPUs.
#10
Megasty
by: GPUCafe
Like before? This is exactly like the 7950GX2 versus the X1900: smaller, efficient chips versus a bigger brute-force chip.
Now that was a GPU war if ever I remember one. If that GX2 had gotten off the ground, it would have eaten the X1950XTX alive. Too bad it didn't even come close to beating it, because it was flawed from the ground up.

by: wolf
and just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.
Does it really matter? Performance-wise, one of these OC'd will come close to matching 2 GTX 280s. But if you really need that much performance and have a grand to waste, 2 of these will murder 2 GTX 280s (let alone 3 of them). There's no way I would waste $1000+ on 2 GTX 280s when 2 4870X2s stomp all over them.
#11
wolf
Performance Enthusiast
because we've all seen how awesome quad-GPU scaling is, right?
#12
farlex85
by: wolf
the ratio isn't exponential, please don't exaggerate that much. And like I said, as for price: 9600GT.

I've said it before and I'll say it again: the people out there who want THE BEST performance will throw money at it. I know a lot of people like that. They all choose Nvidia because it makes the beefiest GPUs.
Don't expect Nvidia to do it again. ATI's method of smaller, quicker GPUs has won out; we likely won't see another monolithic chip from Nvidia, as it wouldn't make much sense for them. Those people you speak of are likely just inclined to root for Nvidia because it's their brand. I'm not gonna say fanboy, I'll say trusted brand. Many who want the best performance will analyse the situation and choose the best GPU out there, not simply throw away their money on something that someone else does better for less.

by: wolf
because we've all seen how awesome quad-GPU scaling is, right?
We have, actually. The GX2 and X2 have both come near 100% scaling across all 4 cores in certain applications. You've got to admit it's getting better, better all the time.......
#13
magibeg
by: wolf
because we've all seen how awesome quad-GPU scaling is, right?
It's a completely different style of GPU scaling though; you can't judge it the same way until we see it. Let's all wait for the numbers. Why don't we all just have red and green sigs from now on? It would make discussion easier :banghead:
#14
wolf
Performance Enthusiast
I mean that for both companies, dude; Nvidia's and ATI's 4-GPU scaling is awful so far.

and as for sigs, don't count me as a green boy; I'm getting a 4870X2 also, I've gotta see what it's all about :toast:

What I'm saying, in effect, though, is that if I had the money I would probably also go for GTX 280 SLI and race them :)
#15
Megasty
by: wolf
because we've all seen how awesome quad-GPU scaling is, right?
I did have two 3870X2s. With the 8.6 drivers they scaled great, even though up until 8.5 there was hardly any difference between them and tri-fire :rolleyes:

3-way SLI is even worse: the 3rd GPU doesn't even exist in most setups (9800GTX & GTX 280/260). The fact that ATI has nearly perfected its dual-GPU architecture says wonders about how scaling has progressed over the years. 3- and 4-GPU scaling still has a long way to go, but at least it's better than nothing.
#16
Bjorn_Of_Iceland
A 4870 in CrossFire is indeed faster than a GTX 280. But a GTX 280 in SLI still pwns... the price is not right though.

A 4870X2 in quad CrossFire is insane!
#17
Nyte
There are a lot of people in here who need to take a course in ASIC design, sheesh...

You guys are saying "ATI sux" because it takes 2 to defeat 1? I guess you could make the same analogy about it taking dual-core CPUs to beat single cores? Smarten up.
#18
wolf
Performance Enthusiast
you smarten up; we're comparing 2 VERY different products here, and it's not as simple as single core vs dual core.

remember also that dual cores are 2 execution cores on the same die. Dual-GPU solutions are not.

it's not us who need lessons in design. And also, which person said "ATI sux"?
#19
Makaveli
lol, it's easy to spot the green-team cheerleaders in this thread.

Bring on the 4870 X2 and whatever NV's answer will be!!!

I want more price drops; let the children fight over who is better.
#20
candle_86
You say GTX 280 SLI sucks; yes, right now it does, but is it the drivers, or is there just not enough power available right now to drive even one card?

I compare these cards to the 8800GTX simply because I can. When they came out, Core 2 adoption was still low, as a lot of people were still using 939 and AM2 based rigs; remember, Core 2 wasn't yet 6 months old when the G80 came out. And what happened when the world went to Core 2? Almost a 30% increase in FPS for the G80 over the FX-62, which was the top CPU out there. Given how powerful the GTX 280 is, can even a modern QX9770 honestly drive this thing at its best? I'm waiting on Nehalem, and I'll bet you money we see massive gains from Nvidia and not so massive from AMD. You watch: it happened once, it will happen again.
#21
echo75
I don't want to sound like a pessimist, but I'll believe it when I see it. There was similar uber hype about what the 4870 could supposedly perform, and meanwhile, in real life, me and many others never saw that hero performance. Maybe it's due to our own hardware limitations, driver incompatibilities or whatever... however, it still remains: "I'll believe it when I see it".
#22
Nyte
by: wolf
you smarten up; we're comparing 2 VERY different products here, and it's not as simple as single core vs dual core.

remember also that dual cores are 2 execution cores on the same die. Dual-GPU solutions are not.

it's not us who need lessons in design. And also, which person said "ATI sux"?
The person who said "ATI sux" conveniently edited their post after I posted mine.

Then I guess your scientific criterion for a fair comparison would be that you have to have 2 GPU ASICs on the same die (to be on the same level as comparing a dual-core CPU to a single-core CPU), am I right? Is this criterion defined in an ISO standard somewhere? I'd be interested to see that. Maybe HardOCP or Guru3D can start using this standard, since it's so scientific.

You can never compare technologies like that. The only way you can ever make a fair performance comparison between 2 products is if they are in the same league of price, power consumption, feature set, and requirements. Comparing a "dual GPU" technology to a "single GPU" while implicitly denouncing the former is not a fair means of comparison by ANY standard (except for some of the posters in this thread).

"Cramming 2 GPUs in to beat NVIDIA": that statement by itself is enough to make any engineer walk away, because it clearly means the speaker knows nothing about ASIC design. Yields, cost, BOM, TDP, complexity... I guess I can throw all those factors away, because as far as I know AMD needs to cram in 2 GPUs to beat NVIDIA, and that MUST mean NVIDIA is better, right?


My input on this matter is done.
#23
GLD
One GPU for me, please. A quad-core CPU? Now we're talking. :)
#24
bigtye
Regardless of whether you're an Nvidia- or ATI-loyal customer, these new cards from both sides are a big performance gain over the ones they outdate. Does this mean we can expect a new era of game graphics to begin shortly?

After all, they can only code to what the hardware can produce. Currently my 9600GT plays everything I want and it looks good. I can't justify buying one of these new cards just yet 'cause mine still works, as did the 1950 Pro it replaced. However, when new stuff comes out I will be "forced to upgrade" (that's for the benefit of my wife, the "forced to" bit :D)

I am looking forward to the new generation of games which these new cards will hopefully encourage and make possible.

Cheaper prices, new games! Woot! Sounds good. :)

Tye
#25
vojc
by: wolf
and just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.
yeah, the point is that 2 ATI GPUs equal the size of 1 Nvidia GPU ;) and...
at the same die size ATI is up to 80% faster and only 10-20% more power hungry