Tuesday, March 30th 2010

XFX Abandons GeForce GTX 400 Series

XFX is getting cozier with AMD by the day, which is a thorn in NVIDIA's side. Amidst the launch of the GeForce GTX 400 series, XFX did what would have been unimaginable a few months ago: it sat out NVIDIA's high-end GPU launch. That's right, XFX has decided against making and selling GeForce GTX 480 and GeForce GTX 470 graphics cards, saying that it favours high-end GPUs from AMD instead. This comes even though XFX appeared to be ready with its own product art. Apart from making new non-reference-design SKUs for pretty much every Radeon HD 5000 series GPU, the company is working on even more premium graphics cards aimed squarely at NVIDIA's high-end GPUs.

The rift between XFX and NVIDIA became quite apparent when XFX openly bashed NVIDIA's high-end lineup in a recent press communication about a new high-end Radeon-based graphics card it's designing. "XFX have always developed the most powerful, versatile Gaming weapons in the world - and have just stepped up to the gaming plate and launched something spectacular that may well literally blow the current NVIDIA offerings clean away," the company said, adding "GTX480 and GTX470 are upon us, but perhaps the time has come to Ferm up who really has the big Guns." The move may disappoint some potential buyers of the GTX 400 series, as XFX's popular Double Lifetime Warranty will not be an option for these cards. XFX, however, maintains that it may still choose to work on lower-end Fermi derivatives.
Source: HardwareCanucks

199 Comments on XFX Abandons GeForce GTX 400 Series

#151
cdawall
where the hell are my stars
phanbuey: Does the speed/way in which a card demands power also make a difference? i.e. if I plug something into my PSU that instantly demands 700 W, then 100 W the next nanosecond, then 850 W again a few ms later, repeatedly, does that put any more strain on the PSU than just demanding 700 W continuously?
No idea, to be honest.
newtekie1: I don't see why that matters, exactly. The end result is that, for less heat and power, Fermi provides more performance than ATi's last-generation high-end card. In any other situation that would have been praised as amazing. It is only in the shadow of RV870 that Fermi doesn't look great.

Here is an interesting little tidbit of information: not once has ATi been able to release a single-GPU card that actually outperformed every card from the previous generation. This includes the HD5870. However, nVidia has with Fermi.

You know, it kind of makes me wonder what the power and heat of RV870 would be like if they did push it to that level of performance...
Hey, what last-gen card outdoes the 5970? And the generation before that, what outdid the 4870X2?
Posted on Reply
#152
newtekie1
Semi-Retired Folder
cdawall: No idea, to be honest.



Hey, what last-gen card outdoes the 5970? And the generation before that, what outdid the 4870X2?
I guess you don't know what Single GPU means...

I'm guessing you also missed the point I was making entirely.
Posted on Reply
#153
cdawall
where the hell are my stars
newtekie1: I guess you don't know what Single GPU means...

I'm guessing you also missed the point I was making entirely.
Maybe you missed that ATI has this thing called CrossFire, and they use it to make cards like the 4870X2 and 5970, which use two GPUs to render. You know, it's kind of like a dual-core CPU... oh wait, we shouldn't have those; that's a mark of progress and should be shunned.
Posted on Reply
#154
crow1001
Wile E: I have whole-house AC, and it would be cooler than my X2 anyway.



Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.
LMAO, you don't see anything that makes the Fermi "480" a terrible card? How about on-par or slightly better performance than a card released six months ago? Yeah, you get the odd TWIMTBP title where it's higher, but who gives a crap. The thing is hotter than the sun and sounds like a tornado, it consumes more watts than the Large Hadron Collider, and overclock the six-month-old 5870 and it will leave the flawed 480 in its wake. Yeah, nothing bad... muhhahaha.
Posted on Reply
#155
newtekie1
Semi-Retired Folder
cdawall: Maybe you missed that ATI has this thing called CrossFire, and they use it to make cards like the 4870X2 and 5970, which use two GPUs to render. You know, it's kind of like a dual-core CPU... oh wait, we shouldn't have those; that's a mark of progress and should be shunned.
Yeah yeah, and nVidia uses SLI to do the same thing; that wasn't the point at all.

I'll explain it again: Fermi is the first single GPU we have seen released that actually tops the previous generation's dual GPUs. That is a huge feat, one that has never been done before. In any other situation that alone would have earned Fermi praise as a wonderful GPU. And in the case of the HD4870x2, Fermi actually does it with less power and less heat, making it even more amazing.

I'm not ignoring the dual-GPU cards; obviously they exist and provide amazing performance, but that wasn't my point at all. Ignoring the complications they add, they definitely are the top dogs. However, that again was not the point. A single GPU topping the previous generation's dual-GPU cards had never been done before, and Fermi doing it while using less power and producing less heat shows that it is an utterly amazing GPU. RV870, however, is even better for other reasons. If it weren't for RV870 having the perfect balance of power usage/heat output/price and performance, Fermi would probably be praised right now instead of bashed.
Posted on Reply
#156
cdawall
where the hell are my stars
newtekie1: Yeah yeah, and nVidia uses SLI to do the same thing; that wasn't the point at all.

I'll explain it again: Fermi is the first single GPU we have seen released that actually tops the previous generation's dual GPUs. That is a huge feat, one that has never been done before. In any other situation that alone would have earned Fermi praise as a wonderful GPU. And in the case of the HD4870x2, Fermi actually does it with less power and less heat, making it even more amazing.

I'm not ignoring the dual-GPU cards; obviously they exist and provide amazing performance, but that wasn't my point at all. Ignoring the complications they add, they definitely are the top dogs. However, that again was not the point. A single GPU topping the previous generation's dual-GPU cards had never been done before, and Fermi doing it while using less power and producing less heat shows that it is an utterly amazing GPU. RV870, however, is even better for other reasons. If it weren't for RV870 having the perfect balance of power usage/heat output/price and performance, Fermi would probably be praised right now instead of bashed.
The only complaint I have is that the 480 doesn't put out less heat or use less juice; from what I have seen it's smoking PSUs, and we never saw that with the 4870X2... The 5870 is also 6 months old and beat the 4870X2 in almost everything, so how is it not in the same boat as Fermi?
Posted on Reply
#157
HalfAHertz
@newtekie1: I think they misunderstood what you're trying to say. If you'll allow me to clear it up: GTX480 > GTX295 > 4870X2. He's not referring to it doubling the performance of the GTX280/285, only to it matching the dual-GPU solutions from last gen.
Posted on Reply
#158
newtekie1
Semi-Retired Folder
cdawall: The only complaint I have is that the 480 doesn't put out less heat or use less juice; from what I have seen it's smoking PSUs, and we never saw that with the 4870X2... The 5870 is also 6 months old and beat the 4870X2 in almost everything, so how is it not in the same boat as Fermi?
I don't care what you've seen, every competent review shows less heat and less power usage than the HD4870x2.

And the HD5870 doesn't beat the HD4870x2 in almost everything; in fact, overall the HD5870 is about 5% behind the HD4870x2, which means the HD4870x2 beats the HD5870 in more things... :slap: That is how it is not in the same boat as Fermi. Beating the HD4870x2 in a few things, but losing overall, is still a loss. Fermi wins overall compared to the HD4870x2.
Posted on Reply
#159
AzureOfTheSky
So newtekie1, the reports that are floating around the net of people with 1+ kW PSUs (quality units like PC P&C, Fortron, SilverStone, and TT Toughpower) blowing after people left them looping 3DMark/FurMark/Heaven while they, say, went to take a shower don't matter?

The thinking in each of these cases is that it's probably due to the card continuing to draw more and more power the longer it's under load. One review I found showed two 480s (the reviewer also tested SLI, but in a separate review) each drawing 340 watts and hitting 101 °C after being left running 3DMark or FurMark or Heaven for an extended time. It wasn't hours, but the guy did leave it running a good while to simulate a real gaming session playing a stressful game in a common case (think it was an Antec 900 or something like that).

The fact is, as the TPU review shows, the card can and does pull more power than NVIDIA wants to admit, it runs hot even at idle, and it really doesn't perform that great for its specs.

1. An overclocked 5870 will be faster than a 480, even an overclocked one (they don't overclock well).
2. Even overclocked, the 5870 couldn't draw as much power or create as much heat as the 480.
3. At idle the 5K cards use VERY LITTLE POWER and run VERY COOL.

NVIDIA should have TESTED before they went to mass production. I know AMD/ATI do; after the 2900 they went back to testing before sending cards/chips to mass production, making sure they can keep them within a reasonable power/heat threshold. Had nV done proper testing before they started mass production, they could have avoided:
1. having such a hot card that's well below the planned specs.
2. having such a high fail rate on cores (make something hard to produce and it's going to have higher fail rates).
3. being 6 months late to market trying to work around heat and production problems.

Yes, the 5K cards had problems too, BUT notice they got worked out and yields are well above 7.1%.

nV screwed up. I would guess they know they screwed up and are working hard to get out either a refresh OR a redesigned product that won't be so damn hot.

I wonder if even a water cooler like the older Toxic cards used could keep these things' heat in check... I have a feeling it would take a 120 mm rad, or even a dual 120 mm rad, to do the job...
Posted on Reply
#160
mdm-adph
newtekie1: I'll explain it again: Fermi is the first single GPU we have seen released that actually tops the previous generation's dual GPUs. That is a huge feat, one that has never been done before.
Newtekie, I've said it before, but you deserve every single cent Nvidia is paying you. You are able to effectively polish a turd like Fermi into something resembling a diamond better than anyone I've seen on the net, and statements like that just prove it. :laugh:
Posted on Reply
#161
digibucc
newtekie1: Fermi wins overall compared to the HD4870x2.
What about a comparable card from NVIDIA's lineup last generation? idk, would that be a 295?

The 480 is not better all-around than the 295, is it?
Posted on Reply
#162
mdm-adph
digibucc: What about a comparable card from NVIDIA's lineup last generation? idk, would that be a 295?

The 480 is not better all-around than the 295, is it?
Shhh -- he's in the zone.
Posted on Reply
#163
newtekie1
Semi-Retired Folder
digibucc: What about a comparable card from NVIDIA's lineup last generation? idk, would that be a 295?

The 480 is not better all-around than the 295, is it?
Actually, yes it is, performance-wise. However, the GTX295 does have better power draw and heat output than the GTX480.
AzureOfTheSky: So newtekie1, the reports that are floating around the net of people with 1+ kW PSUs (quality units like PC P&C, Fortron, SilverStone, and TT Toughpower) blowing after people left them looping 3DMark/FurMark/Heaven while they, say, went to take a shower don't matter?

The thinking in each of these cases is that it's probably due to the card continuing to draw more and more power the longer it's under load. One review I found showed two 480s (the reviewer also tested SLI, but in a separate review) each drawing 340 watts and hitting 101 °C after being left running 3DMark or FurMark or Heaven for an extended time. It wasn't hours, but the guy did leave it running a good while to simulate a real gaming session playing a stressful game in a common case (think it was an Antec 900 or something like that).

The fact is, as the TPU review shows, the card can and does pull more power than NVIDIA wants to admit, it runs hot even at idle, and it really doesn't perform that great for its specs.

1. An overclocked 5870 will be faster than a 480, even an overclocked one (they don't overclock well).
2. Even overclocked, the 5870 couldn't draw as much power or create as much heat as the 480.
3. At idle the 5K cards use VERY LITTLE POWER and run VERY COOL.

NVIDIA should have TESTED before they went to mass production. I know AMD/ATI do; after the 2900 they went back to testing before sending cards/chips to mass production, making sure they can keep them within a reasonable power/heat threshold. Had nV done proper testing before they started mass production, they could have avoided:
1. having such a hot card that's well below the planned specs.
2. having such a high fail rate on cores (make something hard to produce and it's going to have higher fail rates).
3. being 6 months late to market trying to work around heat and production problems.

Yes, the 5K cards had problems too, BUT notice they got worked out and yields are well above 7.1%.

nV screwed up. I would guess they know they screwed up and are working hard to get out either a refresh OR a redesigned product that won't be so damn hot.

I wonder if even a water cooler like the older Toxic cards used could keep these things' heat in check... I have a feeling it would take a 120 mm rad, or even a dual 120 mm rad, to do the job...
I won't believe the power supply stories until I see some real sources backing it up. I find it hard to believe that we've seen so many reviews doing the exact same thing, including W1z's, and not a single reviewer had an issue. Not to mention we've seen cards in the past that have drawn even more power than the GTX480.

Yes, I've already gone over the fact that, compared to the HD5870, the GTX480 isn't as good power- and heat-wise; even the GTX470 isn't. But again, that is because the HD5870 set an amazingly high bar. Really, compared to past cards, the GTX480 isn't great or even really good heat- and power-wise, but it isn't terrible either.

And I find it funny that you talk about ATi never releasing another hot, problematic card after the HD2900 series, because I seem to remember the HD4850's issues with heat, and FurMark killing cards... but yeah, ATi tests everything really well before they release it... Don't post BS; ATi isn't perfect like you seem to think they are.
Posted on Reply
#164
mdm-adph
newtekie1: Actually, yes it is, performance-wise. However, the GTX295 does have better power draw and heat output than the GTX480.
Oops! Nope, not at the highest resolutions demanded by the most uber-elite enthusiasts among us, no. I find this interesting, too, since doesn't the GTX295 technically have a smaller frame buffer? (1792MB/2)

Posted on Reply
#166
mdm-adph
digibucc: 100-101 lol
Still not "all around better." ;)
Posted on Reply
#167
newtekie1
Semi-Retired Folder
mdm-adph: Oops! Nope, not at the highest resolutions demanded by the most uber-elite enthusiasts among us, no. I find this interesting, too, since doesn't the GTX295 technically have a smaller frame buffer? (1792MB/2)

tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/perfrel_2560.gif
Oops! Yep, overall it is.



One resolution doesn't matter; overall is what matters. Next you'll be telling us we should pick whichever benchmark ATi did best in and use that as the final word on which card is better...
Posted on Reply
#168
mdm-adph
newtekie1: Oops! Yep, overall it is.

tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/perfrel.gif

One resolution doesn't matter; overall is what matters. Next you'll be telling us we should pick whichever benchmark ATi did best in and use that as the final word on which card is better...
Hey -- you said "all-around." Obviously, if I'm gaming at 2560x1600, the GTX480 is not better than the previous generation. You can't pick and choose your stats.

Next you'll be saying the GTX480 is best because green is a pretty color.
Posted on Reply
#169
newtekie1
Semi-Retired Folder
mdm-adph: Hey -- you said "all-around." Obviously, if I'm gaming at 2560x1600, the GTX480 is not better than the previous generation. You can't pick and choose your stats.

Next you'll be saying the GTX480 is best because green is a pretty color.
You don't know what all-around better means, do you?

All-around means: considering all aspects. Looking at one resolution isn't considering all aspects now, is it?

And you are the one trying to pick and choose stats here; when I use overall performance, I'M USING ALL THE STATS! Guess what picking one resolution is... I'll tell you... it is picking and choosing your stats.
Posted on Reply
#170
mdm-adph
newtekie1: You don't know what all-around better means, do you?

All-around means: considering all aspects. Looking at one resolution isn't considering all aspects now, is it?

And you are the one trying to pick and choose stats here; when I use overall performance, I'M USING ALL THE STATS! Guess what picking one resolution is... I'll tell you... it is picking and choosing your stats.
Funny -- "all-around better," to me, means "better in every way." And obviously it's not, not to mention considering the heat it'll probably be an RMA in a few months, anyway.

But, please, continue with the defense of Nvidia. It's cute, now that the tides have turned. :laugh:
Posted on Reply
#171
Velvet Wafer
newtekie1: You don't know what all-around better means, do you?

All-around means: considering all aspects. Looking at one resolution isn't considering all aspects now, is it?
Really, newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi ;)
Come on, Nvidia can't always win! :)
This time they fucked up, not ATI... they did the same in the old HD2900 days, for exactly the same reason... trying to keep up with the fastest cards available (which were the G80 series at the time ;)) without thinking first.
Posted on Reply
#172
cdawall
where the hell are my stars
newtekie1: You don't know what all-around better means, do you?

All-around means: considering all aspects. Looking at one resolution isn't considering all aspects now, is it?

And you are the one trying to pick and choose stats here; when I use overall performance, I'M USING ALL THE STATS! Guess what picking one resolution is... I'll tell you... it is picking and choosing your stats.
Why not average the resolutions people actually use then, 'cause overall includes 1024x768 and 1280x1024, and neither of those is used by someone who has a GTX480. Take out the useless benchmarks and the 480 starts to fall back. It should shine at high res but it doesn't; hell, it can't even render better than the 4870X2, and it pulls more power and pushes more heat than a dualie card.
Posted on Reply
#173
newtekie1
Semi-Retired Folder
mdm-adphFunny -- "all-around better," to me, means "better in every way." And obviously it's not, not to mention considering the heat it'll probably be an RMA in a few months, anyway.

But, please, continue with the defense of Nvidia. It's cute, now that the tides have turned. :laugh:
Re-read some of my posts, I'm hardly defending nVidia. In the shadow of RV870, Fermi doesn't look good at all. I'm just putting it in perspective a little, because really I think Fermi is taking more flak than it deserves.
Velvet Wafer: Really, newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi ;)
Come on, Nvidia can't always win! :)
This time they fucked up, not ATI... they did the same in the old HD2900 days, for exactly the same reason... trying to keep up with the fastest cards available (which were the G80 series at the time ;)) without thinking first.
We were discussing pure performance. Yes, to decide what is the better card, every aspect needs to be addressed. RV870 without a doubt is the better GPU when considering every aspect.
cdawall: Why not average the resolutions people actually use then, 'cause overall includes 1024x768 and 1280x1024, and neither of those is used by someone who has a GTX480. Take out the useless benchmarks and the 480 starts to fall back. It should shine at high res but it doesn't; hell, it can't even render better than the 4870X2, and it pulls more power and pushes more heat than a dualie card.
You certainly can rule out some resolutions, but then that wouldn't be overall performance. When buying a card, I'd personally look directly at the resolution I was going to use, and nothing else. However, in a discussion, I'm going to use overall performance, because different people use different resolutions.

And I'm willing to bet that the higher-resolution issues will be worked out with drivers; there are pretty obviously still driver issues with the initial release.
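
To put some numbers on how much the choice of resolutions can swing an "overall" figure, here is a rough Python sketch with purely made-up relative-performance ratios (hypothetical numbers for illustration only, not taken from any review, and not how TPU actually computes its summary chart):

# Hypothetical relative performance of card A vs. card B at each resolution.
# A ratio above 1.0 means card A is faster; all figures are invented for illustration.
results = {
    "1024x768":  1.10,   # card A 10% ahead at a low resolution
    "1280x1024": 1.05,
    "1680x1050": 1.00,
    "1920x1200": 0.97,
    "2560x1600": 0.90,   # card A 10% behind at the highest resolution
}

def overall(ratios):
    # Simple arithmetic mean of the per-resolution ratios.
    ratios = list(ratios)
    return sum(ratios) / len(ratios)

all_res  = overall(results.values())                                   # ~1.004: card A "wins overall"
high_res = overall(results[r] for r in ("1920x1200", "2560x1600"))     # ~0.935: card B wins at high res

print(f"All five resolutions averaged: {all_res:.3f}")
print(f"High resolutions only:         {high_res:.3f}")

Same made-up data, two opposite conclusions; which resolutions go into the average matters as much as the individual results do.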
Posted on Reply
#174
cdawall
where the hell are my stars
newtekie1: You certainly can rule out some resolutions, but then that wouldn't be overall performance. When buying a card, I'd personally look directly at the resolution I was going to use, and nothing else. However, in a discussion, I'm going to use overall performance, because different people use different resolutions.

And I'm willing to bet that the higher-resolution issues will be worked out with drivers; there are pretty obviously still driver issues with the initial release.
And there are plenty of pretty obvious design issues with Fermi that will probably get worked out with a new card, but that's beside the point; this is what we have now, and it looks shitty in performance per watt and heat output, both of which hugely affect overall rig performance. An extra 300-400 watts of heat has to go somewhere in the case, and heat rises; that just so happens to be CPU land in most cases. Can your CPU cooler handle losing a good chunk of its cooling capacity due to hot air vs. cool air?
Posted on Reply
#175
eidairaman1
The Exiled Airman
Velvet Wafer: Really, newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi ;)
Come on, Nvidia can't always win! :)
This time they fucked up, not ATI... they did the same in the old HD2900 days, for exactly the same reason... trying to keep up with the fastest cards available (which were the G80 series at the time ;)) without thinking first.
And at that time ATI was going through a merger with AMD; what NVIDIA is going through is what I'd like to know.
Posted on Reply