
R700 up to 80% Faster than GeForce GTX 280

Yeah, but their stock cooler is crap, so that feature is worthless and it's just a way for most people to damage something. I don't overvolt my video cards; I OC at stock volts, and I care about who has the headroom at stock voltage with stock cooling. IMO ATI doesn't have either.

Please read some news first to see the upcoming stock (or non-stock, if you wish) coolers ;)
 
True, we need better Phenom CPUs, or else I go blue.

Honestly, you don't need THAT much out of a CPU for gaming nowadays; even a mid-range dual core will do more than enough.

Kinda funny, as I would probably have more actual use for a quad as a 3D artist than most of TPU, yet I don't have one :laugh:
 
Kinda funny, as I would probably have more actual use for a quad as a 3D artist than most of TPU, yet I don't have one :laugh:

+1. I'm in the same circumstances. LOL. But we don't work at home anyway, do we?

I don't, and I want to... :cry:
 
Now that is ownage. I just want AMD to come out with a chip that can do that to Intel. That should keep the company alive for a while longer. Go ATI!
 
I'm glad this will bring the 260/280 prices down even more; also, with the 55nm 260/280s coming, look out ATI.
I'm sorry, but card for card NV owns. Yeah, the prices were out of whack, but that's being fixed. Just goes to show that being the first to have new stuff does indeed come at a price!
 
Yeah, but their stock cooler is crap, so that feature is worthless and it's just a way for most people to damage something. I don't overvolt my video cards; I OC at stock volts, and I care about who has the headroom at stock voltage with stock cooling. IMO ATI doesn't have either.

Why do you have to be so openly ignorant?
 
No offense... but do you own a Phenom... or a 4xxx series card? I see your sig mentions you have a Spider, but your specs don't show anything that is Spider at all? O.o


None taken. I had a 9500 B2 Phenom for a while, then had to sell it for a better rig overall, then got stuck with a budget, so I got the cheapest rig for now, with a Phenom 9850 BE on the way. My current rig is satisfying my needs for now, so no rush. :)
 
Just a quick check on ATI vs Nvidia prices from small Finland:

4850: 149€ > $230
4870: 230€ > $354

GTX 260: 286€ > $440
GTX 280: 399€ > $614
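Those dollar figures all work out to a flat conversion at roughly 1.54 USD/EUR, which was about the mid-2008 rate. A quick sanity-check sketch, assuming that rate (the rate itself is my assumption, not from the post):

```python
# Sanity check of the EUR -> USD conversions above.
# RATE is an assumed ~mid-2008 exchange rate, not a figure from the post.
RATE = 1.54  # USD per EUR

prices_eur = {"4850": 149, "4870": 230, "GTX 260": 286, "GTX 280": 399}

for card, eur in prices_eur.items():
    print(f"{card}: {eur}e -> ${eur * RATE:.0f}")
# 4850: 149e -> $229   (the post rounds this one up to $230)
# 4870: 230e -> $354
# GTX 260: 286e -> $440
# GTX 280: 399e -> $614
```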
 
I think the 4870 X2 could be equal to a GTX 280 x2, since they made that bridge chip which provides better efficiency than other dual-GPU solutions.
 
First time ATI has been on top in a LONG while... the X1900 XTX comes to mind.
Good for ATI!
 
First time ATI has been on top in a LONG while... the X1900 XTX comes to mind.
Good for ATI!

Yeah, the X1900 XTX was on top for a while, but then the 7900 GTX came out and did better for like 100 bucks less. Why does that seem like a lifetime ago? :P
 
A GTX 280 x2 will probably be a little faster, by 10 or 15%, but efficiency might go to ATI.

If it's 80% faster than one GTX 280, the dual version of the 280 is bound to be faster.
 
A GTX 280 x2 will probably be a little faster, by 10 or 15%, but efficiency might go to ATI.

If it's 80% faster than one GTX 280, the dual version of the 280 is bound to be faster.

Two 280s may not be as efficient.
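To put rough numbers on that, here's a quick sketch of where a 280 x2 would land at typical SLI scaling rates (the scaling percentages are my assumptions, not benchmark results):

```python
# Relative performance sketch; a single GTX 280 = 1.0.
# SLI scaling percentages are assumed typical values, not measurements.
gtx280 = 1.0
r700 = gtx280 * 1.80  # the "up to 80% faster" headline figure

for scaling in (0.6, 0.7, 0.8):
    gtx280_x2 = gtx280 * (1 + scaling)
    print(f"280 x2 at {scaling:.0%} SLI scaling: {gtx280_x2:.2f} vs R700 {r700:.2f}")
# Unless SLI scales beyond 80%, two 280s only match the headline R700 number.
```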
 
Yeah, the X1900 XTX was on top for a while, but then the 7900 GTX came out and did better for like 100 bucks less. Why does that seem like a lifetime ago? :P
Don't forget the X1950 XTX was the king at that time :D
 
Great news and whatnot, but it's taking two cores to top one Nvidia GPU. I'd love to see ATI get a one-core stomper out there.
 
Now that's being pwned to death. NV needs to dump that GT200 completely. If they don't watch it, their next-gen card won't even be as fast as the 4870 X2 and yet cost just as much, if not more, than the GTX 280. I thought the thing would pwn, but this is ridiculous.
 
Great news and whatnot, but it's taking two cores to top one Nvidia GPU. I'd love to see ATI get a one-core stomper out there.

I was wondering how long it would take for someone to mention that :) So what if it has 2 cores? It's the way of the future, just like a dual-core processor.
 
I was wondering how long it would take for someone to mention that :) So what if it has 2 cores? It's the way of the future, just like a dual-core processor.

That really wasn't his point... ah, it doesn't matter :rolleyes:

ATi pretty much has a game plan. They're not focusing as much on NV as NV is on them - for obvious reasons. If ATi continues on this path, they will have a single GPU that eats all comers. The 4870 is more than twice as powerful as a 3870. It sounds ridiculous to think that their next GPU will be 2x+ as powerful as a 4870, but the possibility is there. ATi also has rumored plans for a dual-core GPU - and if it takes something like that to take the 'single' GPU crown, then so be it. But the last thing ATi will do right now is stray from their game plan (architecture) when it's finally starting to bear sweet fruit.
 
How about we compare R700 (2x RV770) to GTX 280 SLI? :) Price is a bit different though, I guess, lol.

BTW, some of you really need to understand GPU architecture a bit better. R700 is not more efficient than a design based on a single giant GPU would be. At least, not for pure 3D performance. The fastest GPU design, due to how parallel 3D rendering is, is always a single GPU. Dual GPUs waste RAM and have to deal with inefficiencies caused by trying to split the tasks and communicating via a pathetically slow bridge chip. There is extra hardware, and hardware performing redundant tasks. And the drivers basically have to be specially set up for every game (this is conveniently ignored by most people for some reason).
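A toy model of the wasted-RAM and scaling point, with made-up numbers purely for illustration (the function and all values below are hypothetical, not measurements):

```python
# Toy model of dual-GPU overhead. All numbers are illustrative, not measured.
# With alternate-frame rendering, each GPU keeps a full copy of textures and
# buffers, so VRAM is mirrored rather than pooled.

def dual_gpu_effective(vram_per_gpu_mb, single_gpu_fps, scaling):
    """scaling = fraction of the second GPU's work NOT lost to driver
    overhead, sync, and the bridge link (varies with the game profile)."""
    effective_vram = vram_per_gpu_mb           # mirrored, not 2x
    effective_fps = single_gpu_fps * (1 + scaling)
    return effective_vram, effective_fps

# Two 512 MB GPUs at 50 fps each, 80% scaling in a well-supported game:
print(dual_gpu_effective(512, 50, 0.8))  # (512, 90.0)
# ...and 0% scaling when the driver has no profile for the game:
print(dual_gpu_effective(512, 50, 0.0))  # (512, 50.0)
```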

The problem is that manufacturing technology cannot cope with mega-huge GPUs. That is why GT200 can't clock as high as G92. The bigger the chip gets and the more transistors it carries, the hotter it runs and the more complex it becomes to make it stable at higher clock speeds. If you look back at how early GPUs barely needed fans, versus today's ridiculous furnaces, you see that manufacturing is way behind what competition has pushed GPUs to become.
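The usual back-of-the-envelope here is that dynamic power scales roughly with switched capacitance (which grows with transistor count) times voltage squared times clock. A rough sketch; the transistor counts are the published figures, but the clocks and the equal-voltage assumption are mine:

```python
# Back-of-the-envelope dynamic power: P ~ C * V^2 * f.
# Transistor counts are the published figures; voltage is assumed equal
# for both chips, so it cancels out of the ratio.
def relative_power(transistors, volts, clock_mhz):
    return transistors * volts**2 * clock_mhz

g92 = relative_power(754e6, 1.1, 650)      # ~754M transistors, 650 MHz
gt200 = relative_power(1_400e6, 1.1, 602)  # ~1.4B transistors, 602 MHz

print(f"GT200 / G92 dynamic power: ~{gt200 / g92:.1f}x")  # ~1.7x
# Nearly double the power budget even at a *lower* clock -- which is why
# the big chip can't simply be clocked up to G92 speeds.
```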

R700 and 9800GX2 are designed to overcome manufacturing inadequacies in the only way possible. They also conveniently allow an entire lineup to be based mainly on a single GPU design. It's just important to realize that this is not the optimum way to go for performance.

Also, realize that there is potential for a refreshed GT200 to be vastly faster. If they shrink it down and tweak it, and this allows it to clock up decently higher, a dual GT250 (or whatever) could be a lot faster than R700.
 
Nope, the 7950 GX2 took the DX9 crown :D

When it worked, and it didn't always work. The critical flaw of dual-GPU cards is drivers, and whether the game engine works with the method used to split the work across the two GPUs. For example, in EverQuest 2 the 7950 GX2 was only as fast as one of its boards (i.e. a 7900 GT).

Besides, who can really claim a GF7 or X19xx as being the crown of anything? Current cards run DX9 just fine and definitely look a hell of a lot better doing it than GF7 did. GF7's image quality was not all that great.
 
It's not too much hotter than the HD 4870. In fact it's cooler, but it does consume a bit more at peak. Not much, really, under normal usage. It has a higher peak consumption but consumes less on average; dual-GPU cards never have both GPUs under full load. Also, Nvidia just needs a GTX 260 GX2 with higher clocks. It doesn't need a GTX 280 GX2 to be on top.

I haven't looked too deeply into this or anything, but two points about what you just said:

1) Wasn't the default fan speed set lower than it needed to be? From what I read, there was a simple fix for this, and just turning the speed up was supposed to help a lot with the heat situation without too much cost in noise. If that's the case, early benchmarks measuring heat don't really speak to the thermals of the chip itself.

2) It was my understanding from various posts that the issue with the 4870's idle power draw is that it's not downclocking properly for 2D mode, so it continues at full-strength readiness even when you're just reading emails. I believe this is just a driver issue, and ATI said it will be resolved with a new Catalyst release. So saying that the RV770 on average draws more power seems like a faulty argument. It draws less at max, and it currently draws more at idle, but once that's fixed via a driver, I'm not sure there's much reason to think it wouldn't draw less power at idle as well. I'm assuming your average was just (load + idle) / 2; if that's the case, as soon as that's fixed the numbers should change significantly.
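To make that last point concrete, here's the (load + idle) / 2 average with made-up wattages (none of these numbers are measurements, just an illustration of how much a fixed idle state moves the average):

```python
# Illustrating the "(load + idle) / 2" average. All wattages are made up.
def simple_avg(load_w, idle_w):
    return (load_w + idle_w) / 2

# Hypothetical 4870: idling at full 3D clocks vs. after a 2D-downclock fix.
print(simple_avg(160, 90))  # 125.0 W -> broken downclocking, high idle draw
print(simple_avg(160, 55))  # 107.5 W -> same load draw, fixed idle
# The load figure never changes; fixing idle alone drops the average ~14%.
```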
 