Thursday, July 10th 2008

R700 up to 80 % Faster than GeForce GTX 280

Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will on average be 50% faster than the GeForce GTX 280, and in some tests up to 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770Pro), will also convincingly outperform the GeForce GTX 280. The R700 series will be brought to market from late July through August.

Source: Hardspell
Add your own comment

149 Comments on R700 up to 80 % Faster than GeForce GTX 280

#1
InnocentCriminal
Resident Grammar Amender
Let's hope we see these sorts of figures; if so, I'll be extremely happy settling for a 4850X2 over a 4870X2. Power consumption would be what closes the decision for me.
Posted on Reply
#2
kaneda
by: candle_86
nope, the 7950GX2 took the DX9 crown :D
No, it didn't.

For one, the moment you upped the AA/AF the GX2 had a fit.

Also, 2 X1950XTX's in CF > 2 GX2's in SLI.

HDR+AA :D
Posted on Reply
#3
InnocentCriminal
Resident Grammar Amender
by: kaneda
... also 2 X1950XTX's in CF...
Brings back memories. :toast:
Posted on Reply
#4
fullinfusion
1.21 Gigawatts
Kiss Amd's azz nvidia lmfao...:rockout:
Posted on Reply
#5
zOaib
can someone make my name red , i wanna be a fanboi representin !!! thx
Posted on Reply
#6
btarunr
Editor & Senior Moderator


Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.
Posted on Reply
#7
DarkMatter
by: btarunr


Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.
:roll:
Posted on Reply
#8
Megasty
by: btarunr


Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.
Rational!? Wats dat :confused: I thought we all bought the biggest, most powerful thing we can't afford :( Dang, now I have to find out what this 'rational' thing is :shadedshu

Fanboyism actually sprouts from a form of rationalism. Some ppl buy a card from one camp & it pleases them to no end, w/o giving them any problems. Fantasy worlds like that then get set in concrete. Fanboyism of that nature is hard to crack. On the other side of the tracks are the fanboys who haven't bought a card in their lives :roll:

A rational person wouldn't even be able to compare a 4870X2 & a GTX 280, mainly because the price/performance ratios are way too different. That thing would need to be $350-400 to make any kind of rational comparison.
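The price/performance point above can be made concrete with a toy calculation. Everything below is illustrative: performance is normalized to the GTX 280, the "50% faster" figure is the rumour from the article, and the dollar prices are hypothetical placeholders, not quoted launch prices.

```python
# Hypothetical figures only: perf is relative (GTX 280 = 100, X2 = 150 per
# the article's "50% faster" rumour); prices are illustrative placeholders.
cards = {
    "GeForce GTX 280": {"price_usd": 650, "perf": 100},
    "HD 4870 X2":      {"price_usd": 500, "perf": 150},
}

def perf_per_dollar(card):
    """Simple price/performance ratio: relative performance per dollar."""
    return card["perf"] / card["price_usd"]

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card):.3f} perf/$")
```

With numbers like these the ratios are too far apart for a straight comparison, which is the point being made: the cheaper, faster card wins on perf/$ by roughly 2:1.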
Posted on Reply
#9
btarunr
Editor & Senior Moderator
Rational = I will buy whatever is best for my money, without bias toward either company, NV / ATI.

I was going to buy an HD 2900 XT. Prices sucked compared to an 8800 GTS 640MB, so I bought the GTS. Then came the 8800 GT, which outperformed it. Sold the GTS for the same price at which an 8800 GT could be bought. Got happy with the 8800 GT. Next time I'm in the market, if ATI offers the best for my cash, I will buy it, but if NV comes up with something even at the last moment, NV gets my cash.
Posted on Reply
#10
newconroer
by: btarunr
Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will on average be 50% faster than the GeForce GTX 280, and in some tests up to 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770Pro), will also convincingly outperform the GeForce GTX 280. The R700 series will be brought to market from late July through August.

Source: Hardspell
Good, good... 100+ fps in games, just what I need.

But is it going to micro-stutter? Is it going to drop from 100 fps to 20 every time your gun goes off?


The only thing the X2 will do is push 280 prices down. The 280 will become even more of the ultimate gamer's card because it's a single-GPU solution, and that eliminates the most common stutter issues.

ATi should have been happy with their jump from the 3x series to the 4x series when it comes to performance, and continued to push the 'price/performance' appeal for the mass market, using that income to rebuild their dwindling campaign. Now they risk waking a sleeping dragon.

As I said before, ATi could possibly kill off the appeal of their 4850/70 product line, leaving them with more or less just the 'crown,' which Nvidia will turn around and trump because they have the resources to do so.
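For readers unfamiliar with the micro-stutter complaint above: the issue with multi-GPU alternate-frame rendering is that the *average* FPS can look healthy while individual frames take far longer, which is exactly the "100 fps to 20" scenario. A minimal sketch, with entirely made-up frame times (real numbers would come from a frame-time logging tool such as FRAPS):

```python
# Made-up frame-time log in milliseconds, illustrating uneven frame pacing:
# most frames are fast, but every third frame takes ~3x as long.
frame_times_ms = [10, 10, 31, 10, 10, 30, 11, 10, 32, 10]

def avg_fps(times_ms):
    """Average FPS over the run: frame count divided by total seconds."""
    return len(times_ms) / (sum(times_ms) / 1000.0)

def worst_instant_fps(times_ms):
    """FPS implied by the single slowest frame."""
    return 1000.0 / max(times_ms)

print(f"average FPS: {avg_fps(frame_times_ms):.1f}")           # ~61 FPS
print(f"worst instantaneous FPS: {worst_instant_fps(frame_times_ms):.1f}")  # ~31 FPS
```

The run above averages about 61 FPS, yet the slowest frames are only delivering about 31 FPS worth of smoothness — an average-FPS benchmark alone would never show it.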
Posted on Reply
#11
btarunr
Editor & Senior Moderator
by: newconroer
Good, good... 100+ fps in games, just what I need.

But is it going to micro-stutter? Is it going to drop from 100 fps to 20 every time your gun goes off?


The only thing the X2 will do is push 280 prices down. The 280 will become even more of the ultimate gamer's card because it's a single-GPU solution, and that eliminates the most common stutter issues.

ATi should have been happy with their jump from the 3x series to the 4x series when it comes to performance, and continued to push the 'price/performance' appeal for the mass market, using that income to rebuild their dwindling campaign. Now they risk waking a sleeping dragon.

As I said before, ATi could possibly kill off the appeal of their 4850/70 product line, leaving them with more or less just the 'crown,' which Nvidia will turn around and trump because they have the resources to do so.
http://forums.techpowerup.com/showpost.php?p=880187&postcount=30

...hope that helps.
Posted on Reply
#12
newconroer
Eh, one person, one game... that doesn't give me very conclusive results.

We already see the 280 accomplishing the same thing, so why jam a possibly more heat- and power-demanding component into your system if you don't need to?
Posted on Reply
#13
btarunr
Editor & Senior Moderator
One person... one game... at least it's better than "nothing to prove there's no stutter".
Posted on Reply
#14
Darkrealms
by: newconroer
Good, good... 100+ fps in games, just what I need.

But is it going to micro-stutter? Is it going to drop from 100 fps to 20 every time your gun goes off?


The only thing the X2 will do is push 280 prices down. The 280 will become even more of the ultimate gamer's card because it's a single-GPU solution, and that eliminates the most common stutter issues.

ATi should have been happy with their jump from the 3x series to the 4x series when it comes to performance, and continued to push the 'price/performance' appeal for the mass market, using that income to rebuild their dwindling campaign. Now they risk waking a sleeping dragon.

As I said before, ATi could possibly kill off the appeal of their 4850/70 product line, leaving them with more or less just the 'crown,' which Nvidia will turn around and trump because they have the resources to do so.

Nice work, I can agree with that. Unfortunately, everyone wants the crown. Nvidia has enough going wrong for them right now, but they have the resources. If they have been lowballing because of the competition, and the 300 series really does exist with 45 nm and GDDR5, this could seriously hurt ATI.
Posted on Reply
#15
steelkane
by: btarunr
Rational = I will buy whatever is best for my money, without bias toward either company, NV / ATI.

I was going to buy an HD 2900 XT. Prices sucked compared to an 8800 GTS 640MB, so I bought the GTS. Then came the 8800 GT, which outperformed it. Sold the GTS for the same price at which an 8800 GT could be bought. Got happy with the 8800 GT. Next time I'm in the market, if ATI offers the best for my cash, I will buy it, but if NV comes up with something even at the last moment, NV gets my cash.
I agree with that statement 1000%, Choice is the greatest thing on EARTH.
Posted on Reply
#16
trt740
by: steelkane
I agree with that statement 1000%, Choice is the greatest thing on EARTH.
Good luck with that; they are coming out so fast it's impossible to tell what's the best buy.
Posted on Reply
#17
DarkMatter
Well, for me it's very easy to tell. The more cards out there the better: more choice. Even when you can't tell which one is best for your needs, it doesn't really matter: with more cards out there, prices go down, and even if you choose the "wrong" card you are always getting more for less than if there weren't such extreme competition.

On another note, that competition on price IS HURTING the graphics industry, mainly driven by Ati's prices. Not blaming them; this is business and it's what they have to do (they are almost forced to), and on the other hand it just happens to be good for us. But I've been wondering lately if we are getting a little greedy: it's very common to see people complaining about prices that are no higher than in the past. Every year we want double the performance, but now we also want it for less money. There will be the eternal debate over whether they were charging too much in the past and only now are being honest with prices. IMO that's not the case. The average selling price has decreased a lot, while at the same time the share of high-end cards has increased. This means lower profits for the companies. I know people think "so what?", but IT companies are more fragile than people think. IMO, if the competition continues in this direction, one of the companies could end up disappearing, and we would regret it. In the long term this is bad for both Ati and Nvidia; either of the two could eventually disappear. Ati's pricing strategy is not sustainable: they can't keep selling cards this cheap forever, and what happens when they release the profitable card? It won't be as good from a perf/price point of view, and the market will be flooded with the previous cards. They would be forced back down to barely profitable prices, just as they are forcing Nvidia now, and of course Nvidia would be in the same situation.

I'm not saying we have to pay more for the same. This rant is not about purchasing decisions; buy the best thing your money can buy. I just think we have to be more aware of the current situation when we complain: current prices are not necessarily fair in any absolute sense. They are very good for us, but I strongly believe we are heading toward a situation where we would be paying less than what would be fair. That said, I love the situation and I'd love it to continue, but I want to stay away from hypocrisy. IMHO:

- That prices go down, even if they make less profit, as long as they have enough to stay in the game without hurting their workers... GOOD.
- That we benefit from that situation, even knowing it's not necessarily fair according to any absolute truth... GOOD.
- That we complain about pricing when it doesn't fit our "distorted" expectations, even when things come at the same prices as in the past... BAD, very bad.

My 2 cents.

Sorry for the rant. I just felt we needed a bit of self-criticism, and this was the day for it. :o
Posted on Reply
#18
Nick89
____ gets the award for biggest Nvidia fanboy in this thread. :roll:

Ok ____, we get it, you won! Aren't you happy? :laugh:


Insert name as you wish, as I can't decide... lol
Posted on Reply
#19
wolf
Performance Enthusiast
by: Nyte
The person who said "ati sux" edited their post conveniently after I posted mine.

Then I guess your scientific criterion for fair comparison would be that you have to have 2 ASIC GPU's on the same die (to be on the same level as comparing a dual core CPU to a single core CPU) am I right? Is this criterion defined in an ISO standard somewhere? I'd be interested to see that. Maybe HardOCP or Guru3D can begin to use this standard since it's so scientific.

You can never compare technologies like that. The only way you can ever make a fair performance comparison between 2 products is if they are in the same league of price, power consumption, featureset, and requirements. Comparing a "dual GPU" technology to a "single GPU" by implicitly denouncing the former is not a fair means of comparison in ANY standard (except for some of the posters in this thread).

"Cramming 2 GPU's to beat NVIDIA". That statement by itself is enough for any engineer to walk away because it clearly means the speaker knows nothing about ASIC design. Yields, cost, BOM, TDP, complexity... I guess I can throw all those factors away because as far as I know, AMD needs to cram 2 GPU's to beat NVIDIA, that MUST mean NVIDIA is better right?


My input on this matter is done.
My issue isn't with how it's built, or the price, or anything pointed out in your fantastic "lecture"; it's with the problems inherent in dual-GPU solutions.

Granted, they have come a long way, but there are users out there who prefer one GPU to avoid any of those possible issues.

It doesn't really bother me who has the best performance; I'll buy it anyway, and I'm getting a 4870X2 for sure. However, the day I crave is when either company can get 4870X2/GTX 280 SLI performance from a single-GPU solution; you can never go wrong with that.
Posted on Reply
#20
Megasty
by: wolf
My issue isn't with how it's built, or the price, or anything pointed out in your fantastic "lecture"; it's with the problems inherent in dual-GPU solutions.

Granted, they have come a long way, but there are users out there who prefer one GPU to avoid any of those possible issues.

It doesn't really bother me who has the best performance; I'll buy it anyway, and I'm getting a 4870X2 for sure. However, the day I crave is when either company can get 4870X2/GTX 280 SLI performance from a single-GPU solution; you can never go wrong with that.
That day will definitely come. Technology doesn't stop, meet walls, limits, or anything of that nature. Just as the GTX 280 is as fast as a 9800GX2, and a 4870 is as fast as a 3870X2, single chips can come to meet the generational standard set by their dual-chip predecessors. If that trend continues (although it's hard to say, since it only just started), then the next-gen single chips will blow us away just as the single-chip improvements did in this series.
Posted on Reply
#21
handydagger
lol, poor Nvidia. I'm imagining in their labs right now they're working hard to get GDDR5 onto their cores, assuming it will let them slap ATI in the next round, while ATI is working on dual-core GPUs, or maybe quad-core GPUs :P with 4-8 CrossFire GPUs :P

Nvidia is lagging now, the same as what happened to 3dfx a few years ago, and maybe Intel is going to take care of it later on :P
Posted on Reply
#22
HTC
What if both nVidia and ATI, in the near future, did what Intel and AMD did?

Imagine...

- A "low end" card would be a 2-cores-in-1-GPU card, much like the 48X0x2 or the 9800GX2, but in a single GPU.

- A "high end" card would be a 4-cores-in-1-GPU card.

- The "single core" card would not even exist anymore, except for very old cards.


Just as Intel and AMD, as far as I know, now only make dual, triple (AMD), quad cores and more, it wouldn't be much of a stretch to guess that the GPU future will be a multi-core one.

IMO, the "single core" in a card will fade, and it won't take very long: about 2 or 3 years, I reckon.


They will manage to make a dual-core GPU just like the C2Ds, for example.



In order to make a dual-core but single GPU, each core must be small enough that 2 of them fit in a single GPU.

This is where I believe ATI has the upper hand: they managed to get a powerful GPU with a "not big" die size, whereas nVidia made a better card but with a "very big" die size.
Posted on Reply
#23
handydagger
ATI is coming closer to a dual-core GPU than Nvidia, assuming the coming 4870X2 will have 2 separate cores on a single card.
The future is in multi-core.
Posted on Reply
#24
Hayder_Master
80%? With an overclock on the R700, can we say 1x R700 = 2x GTX 280 in SLI?
Posted on Reply