Tuesday, March 30th 2010

XFX Abandons GeForce GTX 400 Series

XFX is getting cozier with AMD by the day, much to NVIDIA's displeasure. Amidst the launch of the GeForce GTX 400 series, XFX did what would have been unimaginable a few months ago: it sat out NVIDIA's high-end GPU launch. That's right, XFX has decided against making and selling GeForce GTX 480 and GeForce GTX 470 graphics cards, saying that it favours high-end GPUs from AMD instead. This comes even though XFX appeared to have its own product art ready. Apart from making new non-reference design SKUs for pretty much every Radeon HD 5000 series GPU, the company is working on even more premium graphics cards aimed squarely at NVIDIA's high-end GPUs.

The rift between XFX and NVIDIA became quite apparent when XFX outright bashed NVIDIA's high-end lineup in a recent press communication about a new high-end Radeon-based graphics card it's designing. "XFX have always developed the most powerful, versatile Gaming weapons in the world - and have just stepped up to the gaming plate and launched something spectacular that may well literally blow the current NVIDIA offerings clean away," adding "GTX480 and GTX470 are upon us, but perhaps the time has come to Ferm up who really has the big Guns." The move may come as a disappointment to some potential buyers of the GTX 400 series, as XFX's popular Double Lifetime Warranty scheme will be missed. XFX, however, maintains that it may choose to work on lower-end Fermi derivatives.

Source: HardwareCanucks

199 Comments on XFX Abandons GeForce GTX 400 Series

#1
xtremesv
AzureOfTheSky said:
fermi= http://techreport.com/articles.x/4966/2 redux.......

anybody who thinks nVidia didn't lose this round is either an insane fanboi or mentally defective
History repeats itself for nVidia.

FX 5800 / GTX 480
- Reduced availability.
- Not quite a performance star.
- Heat and noise problems.
- Late.

The difference is that now nVidia buyers have pieces of paper moving around in games (Physx), 3D glasses (if you can afford the whole kit), can work with some serious C programming and video encoding (CUDA). Nice features though. Ah, I was forgetting "The Way It's Meant to Be Played" extra value.
#2
Zubasa
xtremesv said:
History repeats itself for nVidia.

FX 5800 / GTX 480
- Reduced availability.
- Not quite a performance star.
- Heat and noise problems.
- Late.

The difference is that now nVidia buyers have pieces of paper moving around in games (Physx), 3D glasses (if you can afford the whole kit), can work with some serious C programming and video encoding (CUDA). Nice features though. Ah, I was forgetting "The Way It's Meant to Be Played" extra value.
I wouldn't say TWIMTBP is "extra value"; in the end the money comes from those who pay for these cards. :shadedshu
We both know that "The Way It's Meant to Be Payed" won't save the GF100 from the Hemlock. :p
#3
theubersmurf
xtremesv said:
History repeats itself for nVidia.

FX 5800 / GTX 480
- Reduced availability.
- Not quite a performance star.
- Heat and noise problems.
- Late.

The difference is that now nVidia buyers have pieces of paper moving around in games (Physx), 3D glasses (if you can afford the whole kit), can work with some serious C programming and video encoding (CUDA). Nice features though. Ah, I was forgetting "The Way It's Meant to Be Played" extra value.
They got the performance crown (within the framework of single-GPU video cards), but that's all they got, and not by a huge margin. By any other metric the card is a disaster. It seems like they sacrificed everything just to get the performance crown, as if that's the only thing people wanted to hear...
#4
idx
maybe ATI did something to them like paying them for that.... just saying..
#5
Binge
Overclocking Surrealism
idx said:
maybe ATI did something to them like paying them for that.... just saying..
Thank you conspiracy theorist for your doubts. Reality just doesn't seem real, right?
#6
xtremesv
Zubasa said:
I wouldn't say TWIMTBP is "extra value"; in the end the money comes from those who pay for these cards. :shadedshu
We both know that "The Way It's Meant to Be Payed" won't save the GF100 from the Hemlock. :p
I've met people who thought that just because X game had the nVidia "It's Meant to Be Played" logo their nVidia cards were superior to ATi on that particular game. Of course, I'm talking about people who still think that having more video memory is all that matters :slap:
#7
Wile E
Power User
newtekie1 said:
I don't remember worrying much about the HD2900's running hot, I believe my issues with them were that they were overpriced. I've always gone with the best bang for the buck. I'll be the first one to point out the problems with the GTX480 also, in fact I did later on down in the post. However, again, my buying decision almost always goes with bang for the buck, and that is pretty much what I am always concerned with.



For someone who, just a few posts back, went on about people not calling you an ATi fanboy because you've owned nVidia cards too... you sure are quick to do the exact same thing to others... :slap:

Did you happen to look at what card is currently in my main rig? An HD4890. Guess what card was in Rig4 before the 8800GTS... an x800xl (bought here from Xazax), and before that was an x1950Pro (sold to Crashnburnxp). I've also had an x1900GT (bought from Blacktruckryder), which was replaced with an HD3850 (bought from Xazax), which was replaced by an HD4670 (sold to 3dsage), which has now been replaced by the HD4890. I just purchased an HD4870x2 off miahallen. About the only series I haven't personally owned a card from was the HD2000 series, which I skipped, and to be fair, I skipped the G80 series also; I wasn't interested in either, as neither offered enough gains over the cards I had at the time to justify the price. The only reason I have a G80 card now is because I got a good deal on it, and I wanted it to replace the x800xl so I could use the machine to fold.



No, the GTX480 doesn't make sense. However, as I already said, the GTX470 is actually looking promising. Granted, I won't totally believe that until we see a W1z review of it, but from the other reviews it is looking promising. It doesn't use an extreme amount of power like the GTX480 (so no worry about killing power supplies), and while it does still get hot, it uses a much weaker heatsink and fan than the GTX480. The heat output is only about 60W more than the HD5870, which does an amazing job in heat and power usage. However, the interesting thing is that the GTX470's heat output and power usage are very much in line with my current HD4890 and GTX285, and actually very similar to the GTX280. I don't need to worry about water-cooling my CPU with those cards, so I'm not worried about it with the GTX470. I think most, including you, seem to be caught up on the GTX480 and are applying its problems to the GTX470. However, it is pretty obvious that nVidia really pushed that card to make it one hell of a beast so it would beat the HD5870 hands down, while the scaled-back GTX470 is actually a reasonable card. The HD5870 is better in heat and power, but the GTX470 certainly isn't unreasonable; it actually compares rather nicely to the high-end cards of the last generation, the HD5870 has just set an extremely high bar.
This is a point I've been trying to make repeatedly. I've pointed out numerous times that the 4870x2 actually draws MORE power in everything (except BD playback) than the GTX480.

ATI just hit a homerun with the 5k series. That doesn't make Fermi a bad card. Not a great card, but not a bad card.
#8
AzureOfTheSky
especially if you need a loud heater for your computer room/house on those cold winter nights....oh wait....its almost summer..........
#9
mastrdrver
Right, let's get this straight.....

2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than

1, 40nm, .99v, GTX 480 gpu

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close to or equal to a 5870. That just makes the GTX 480 look worse imo.

I guess you don't have to worry about SLI profiles with one gpu. I guess that's a plus for Fermi.........right?
#10
Wile E
Power User
AzureOfTheSky said:
especially if you need a loud heater for your computer room/house on those cold winter nights....oh wait....its almost summer..........
I have whole house AC, and it would be cooler than my X2 anyway.

mastrdrver said:
Right, let's get this straight.....

2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than

1, 40nm, .99v, GTX 480 gpu

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close to or equal to a 5870. That just makes the GTX 480 look worse imo.

I guess you don't have to worry about SLI profiles with one gpu. I guess that's a plus for Fermi.........right?
Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.
#11
Unregistered
Imagine, if they cannot offer a warranty of even 1 year, what must the quality of those boards be? And XFX is known for its 5-year-plus warranties, even lifetime ones, so it is very understandable why they chose not to market the new cards from Nvidia.

Also, a card that "might" only last 2 or 3 years is not worth buying anyways.
#12
mastrdrver
Wile E said:
Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.
Outperforms.....slightly.

Maybe it does a lot more in benches, but since I play games I didn't really look at the "Vantage" parts of all the reviews I read.

BTW, I read somewhere that a couple reviewers got bum GTX 480s. Still trying to find them though.

Fermi isn't terrible.....but it came close.
#13
segalaw19800
mtosev said:
Bad news for nVidia. They are slowly slipping away. If nVidia continues on this path they will end up as 3dfx did.
Nvidia bought out 3dfx :slap:
#14
HalfAHertz
mastrdrver said:
Right, let's get this straight.....

2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than

1, 40nm, .99v, GTX 480 gpu

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close to or equal to a 5870. That just makes the GTX 480 look worse imo.

I guess you don't have to worry about SLI profiles with one gpu. I guess that's a plus for Fermi.........right?
My 300nm Pentium 1 ran on 5v (or was it 2.5v?) and it was passively cooled. What's your point?
#15
imperialreign
Wile E said:
Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.
It doesn't outperform them by all that much - especially considering how hyped the card was . . . as well, the slight margin it has over the other two seems to dwindle quickly as the resolution goes up. I mean, considering just how much nVidia was hyping this card, I was expecting performance close to (if not better than) the 5970.

TBH, I can't say it's a bad card, either . . . but I don't really see where it's a better deal over a 5870.
#16
cdawall
where the hell are my stars
Wile E said:
This is a point I've been trying to make repeatedly. I've pointed out numerous times that the 4870x2 actually draws MORE power in everything (except BD playback) than the GTX480.

ATI just hit a homerun with the 5k series. That doesn't make Fermi a bad card. Not a great card, but not a bad card.
I think we are going to see that change. Something is wrong with how these cards got measured; I have already seen reports of three 1000W+ PSUs getting killed by Fermi upgrades: one TT Toughpower 1200W, one FSP Bluetop 1000W, and one FSP 1200W. Something isn't right here and it needs to get figured out now, because even the 4870x2 wasn't blowing PSUs.
#17
phanbuey
cdawall said:
I think we are going to see that change. Something is wrong with how these cards got measured; I have already seen reports of three 1000W+ PSUs getting killed by Fermi upgrades: one TT Toughpower 1200W, one FSP Bluetop 1000W, and one FSP 1200W. Something isn't right here and it needs to get figured out now, because even the 4870x2 wasn't blowing PSUs.
HAH... wow. That is insane.
#18
cdawall
where the hell are my stars
phanbuey said:
HAH... wow. That is insane.
One card popped running 3DMark looped, which is odd. Maybe as they produce more heat they become less efficient, pulling more than the ~300W reviews have been putting them at. That's what I think is happening, and the cards are overdrawing multi-rail PSUs: most multi-rail designs set the per-rail current limit higher than the PSU can actually handle if multiple rails max out. This has led to overloaded high-end PSUs, but only the multi-rail ones.
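The multi-rail overdraw scenario described above can be sketched numerically. This is purely an illustrative sketch with hypothetical figures (the rail counts, OCP limits, and capacities are invented for the example, not specs of any real PSU): per-rail over-current protection never trips, yet the sum of the rail loads exceeds what the unit can actually deliver.

```python
# Illustrative sketch of the multi-rail overload scenario.
# All numbers below are hypothetical examples, not measured PSU specs.

def psu_overloaded(rail_loads_amps, rail_ocp_amps, combined_12v_watts):
    """Check a 12 V multi-rail PSU for two distinct failure modes.

    Returns (ocp_tripped, combined_exceeded):
    - ocp_tripped: some rail exceeds its own over-current protection limit;
    - combined_exceeded: the *sum* of rail loads exceeds the PSU's real
      combined 12 V capacity, which per-rail OCP does not guard against
      when every rail individually stays under its limit.
    """
    ocp_tripped = any(load > limit
                      for load, limit in zip(rail_loads_amps, rail_ocp_amps))
    total_watts = sum(rail_loads_amps) * 12.0
    combined_exceeded = total_watts > combined_12v_watts
    return ocp_tripped, combined_exceeded

# Hypothetical 1000 W unit: four 12 V rails with 25 A OCP each
# (1200 W of nominal per-rail headroom), but only 840 W of real
# combined 12 V capacity.
loads = [20.0, 20.0, 20.0, 20.0]  # 80 A total = 960 W demanded
tripped, exceeded = psu_overloaded(loads, [25.0] * 4, 840.0)
print(tripped, exceeded)  # no rail trips OCP, yet combined capacity is exceeded
```

The point of the sketch is that per-rail limits summed together can nominally promise more than the combined 12 V stage can deliver, so a single-rail design with one honest limit avoids this gap.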
#19
DarthCyclonis
So when are we going to see the GTX480 Co-op edition with a G92 for PhysX! lmao.
#20
phanbuey
cdawall said:
One card popped running 3DMark looped, which is odd. Maybe as they produce more heat they become less efficient, pulling more than the ~300W reviews have been putting them at. That's what I think is happening, and the cards are overdrawing multi-rail PSUs: most multi-rail designs set the per-rail current limit higher than the PSU can actually handle if multiple rails max out. This has led to overloaded high-end PSUs, but only the multi-rail ones.
Yeah I always use single rails, but still...

They're definitely pushing the line on the manufacturing process. For sure they sent cherry-picked samples to reviewers. Not all cards draw the same amount of juice, am I right? Some chips might be leakier/run hotter/burn more watts than others.

Sounds like it might be a QC issue. Was it all the same brand of card?
#21
cdawall
where the hell are my stars
phanbuey said:
Yeah I always use single rails, but still...

They're definitely pushing the line on the manufacturing process. For sure they sent cherry-picked samples to reviewers. Not all cards draw the same amount of juice, am I right? Some chips might be leakier/run hotter/burn more watts than others.

Sounds like it might be a QC issue. Was it all the same brand of card?
I'll ask the shop they got returned to, but I do not believe they were. Some of the reviewers show higher loads as well; I have seen anywhere from 300 to 340 and up.
#23
cdawall
where the hell are my stars
[I.R.A]_FBi said:
300-340, the variance is great
That's a good difference, haha. That's like an 8400GS extra in the system.
#24
phanbuey
cdawall said:
That's a good difference, haha. That's like an 8400GS extra in the system.
Does the speed/way in which a card demands power also make a difference? i.e. if I plug something into my PSU that instantly demands 700W, then the next nanosecond 100W, and then 850W again a few ms later, repeatedly, does that put more strain on the PSU than just demanding 700W continuously?
#25
newtekie1
Semi-Retired Folder
mastrdrver said:
Right, let's get this straight.....

2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than

1, 40nm, .99v, GTX 480 gpu

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close to or equal to a 5870. That just makes the GTX 480 look worse imo.

I guess you don't have to worry about SLI profiles with one gpu. I guess that's a plus for Fermi.........right?
I don't see why that matters, exactly. The end result is, for less heat and power, Fermi provides more performance than ATi's last-generation high-end card. In any other situation that would have been praised as amazing. It is only in the shadow of RV870 that Fermi doesn't look great.

Here is an interesting little tidbit of information: not once has ATi been able to release a single-GPU card that actually outperformed every card from the previous generation. This includes the HD5870. However, nVidia has with Fermi.

You know, it kind of makes me wonder what the power and heat of RV870 would be like if they did push it to that level of performance...