
XFX Abandons GeForce GTX 400 Series

Fermi = http://techreport.com/articles.x/4966/2 redux.......

Anybody who thinks nVidia didn't lose this round is either an insane fanboi or mentally defective.

History repeats itself for nVidia.

FX 5800 / GTX 480
- Reduced availability.
- Not quite a performance star.
- Heat and noise problems.
- Late.

The difference is that now nVidia buyers have pieces of paper moving around in games (PhysX), 3D glasses (if you can afford the whole kit), and can do some serious C programming and video encoding (CUDA). Nice features, though. Ah, I was forgetting the "The Way It's Meant to Be Played" extra value.
 
I wouldn't say TWIMTBP is "extra value"; in the end, the money comes from those who pay for these cards. :shadedshu
We both know that "The Way It's Meant to Be Payed" won't save the GF100 from the Hemlock. :p
 
They got the performance crown (among single-GPU video cards), but that's all they got, and not by a huge margin. By any other metric the card is a disaster. It seems like they sacrificed everything just to get the performance crown, as if that's the only thing people wanted to hear...
 
Maybe ATI did something to them, like paying them for that... just saying...
 

Thank you, conspiracy theorist, for your doubts. Reality just doesn't seem real, right?
 

I've met people who thought that just because X game had the nVidia "It's Meant to Be Played" logo their nVidia cards were superior to ATi on that particular game. Of course, I'm talking about people who still think that having more video memory is all that matters :slap:
 
I don't remember worrying much about the HD2900s running hot; I believe my issue with them was that they were overpriced. I've always gone with the best bang for the buck. I'll be the first one to point out the problems with the GTX480 too, and in fact I did, later on down in the post. However, again, my buying decision almost always goes with bang for the buck, and that is pretty much what I am always concerned with.



For someone that just a few posts back went on about people not calling you an ATi fanboy because you've owned nVidia cards too... you sure are quick to do the exact same thing to others... :slap:

Did you happen to look at what card is currently in my main rig? An HD4890. Guess what card was in Rig4 before the 8800GTS... an x800xl (bought here from Xazax), and before that an x1950Pro (sold to Crashnburnxp). I've also had an x1900GT (bought from Blacktruckryder), which was replaced with an HD3850 (bought from Xazax), which was replaced by an HD4670 (sold to 3dsage), which has now been replaced by the HD4890. I just purchased an HD4870x2 off miahallen. About the only series I haven't personally owned a card from is the HD2000 series, which I skipped, and to be fair, I skipped the G80 series too; I wasn't interested in either, as neither offered enough gains over the cards I had at the time to justify the price. The only reason I have a G80 card now is that I got a good deal on it, and I wanted it to replace the x800xl so I could use the machine to fold.



No, the GTX480 doesn't make sense. However, as I already said, the GTX470 is actually looking promising. Granted, I won't totally believe that until we see a W1z review of it, but from the other reviews it looks promising. It doesn't use an extreme amount of power like the GTX480 (so no worry about killing power supplies), and while it does still get hot, it uses a much weaker heatsink and fan than the GTX480. The heat output is only about 60W more than the HD5870, which does an amazing job on heat and power usage. Interestingly, though, the GTX470's heat output and power usage are very much in line with my current HD4890 and GTX285, and actually very similar to the GTX280. I don't need to worry about water-cooling my CPU with those cards, so I'm not worried about it with the GTX470. I think most people, including you, are caught up on the GTX480 and applying its problems to the GTX470. It's pretty obvious that nVidia really pushed the GTX480 to make it one hell of a beast so it would beat the HD5870 hands down, while the scaled-back GTX470 is actually a reasonable card. The HD5870 is better on heat and power, but the GTX470 certainly isn't unreasonable; it compares rather nicely to the high-end cards of the last generation. The HD5870 has just set an extremely high bar.

This is a point I've been trying to make repeatedly. I've pointed out numerous times that the 4870x2 actually draws MORE power in everything (except BD playback) than the GTX480.

ATI just hit a homerun with the 5k series. That doesn't make Fermi a bad card. Not a great card, but not a bad card.
 
Especially if you need a loud heater for your computer room/house on those cold winter nights... oh wait... it's almost summer...
 
Right, let's get this straight.....

Two 55nm, 1.3V 4870 GPUs run a little warmer and draw a little more power than

one 40nm, 0.99V GTX 480 GPU.

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close or equal to a 5870. That just makes the GTX 480 look worse, imo.

I guess you don't have to worry about SLI profiles with one GPU. I guess that's a plus for Fermi.........right?
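For what it's worth, the voltage argument cuts both ways: dynamic power in CMOS scales roughly with C·V²·f, so one big low-voltage die can still pull as many watts as two smaller high-voltage ones. A back-of-the-envelope sketch (all numbers below are made-up illustrations, not real card specs):

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f.
# All values are illustrative placeholders, NOT measured card specs.

def dynamic_power(cap, volts, freq):
    """Relative dynamic power: switched capacitance * voltage^2 * frequency."""
    return cap * volts**2 * freq

# Two hypothetical small 55nm GPUs at 1.30 V vs one big 40nm GPU at 0.99 V.
# 'cap' stands in for switched capacitance (it scales with transistor count).
two_small = 2 * dynamic_power(cap=1.0, volts=1.30, freq=0.75)
one_big = dynamic_power(cap=3.2, volts=0.99, freq=0.70)

# The big low-voltage die lands in the same ballpark as the pair of
# high-voltage dies: voltage alone doesn't decide wattage.
print(two_small, one_big)
```

The point being that voltage is only one factor; switched capacitance (transistor count) and clocks matter just as much, which is why the measured end-result wattage is the number worth arguing about.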
 

I have whole-house AC, and it would be cooler than my X2 anyway.


Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.
 
If they can't offer a warranty for even one year, imagine the quality of those boards. XFX is known for 5-year and even lifetime warranties, so it is very understandable why they chose not to market the new cards from Nvidia.

Also, a card that "might" only last 2 or 3 years is not worth buying anyway.
 

Outperforms.....slightly.

Maybe it does a lot more in benches, but since I play games I didn't really look at the "Vantage" parts of all the reviews I read.

BTW, I read somewhere that a couple reviewers got bum GTX 480s. Still trying to find them though.

Fermi isn't terrible.....but it came close.
 
Bad news for nVidia; they are slowly slipping away. If nVidia continues on this path they will end up as 3dfx did.

Nvidia bought out 3dfx :slap:
 

My 300nm Pentium 1 ran on 5V (or was it 2.5V?) and it was passively cooled. What's your point?
 

It doesn't outperform them by all that much, especially considering how hyped the card was... and the slight margin it has over the other two seems to dwindle quickly as the resolution goes up. I mean, considering just how much nVidia was hyping this card, I was expecting performance close to (if not better than) the 5970.

TBH, I can't say it's a bad card either... but I don't really see where it's a better deal than a 5870.
 

I think we are going to see that change. Something is wrong with how these cards got measured. I have already seen reports of three 1000W+ PSUs getting killed by Fermi upgrades: one TT Toughpower 1200W, one FSP Bluetop 1000W, and one FSP 1200W. Something isn't right here, and it needs to get figured out now, because even the 4870x2 wasn't blowing PSUs.
 

HAH... wow. That is insane.
 

One card popped running 3DMark on a loop, which is odd. Maybe as they produce more heat they become less efficient, pulling more than the ~300W reviews have been putting them at. That's what I think is happening, and the cards are overdrawing multi-rail PSUs: most multi-rail designs set the per-rail circuit breaker (OCP) higher than the PSU can actually handle if multiple rails max out. This has led to overloaded high-end PSUs, but only the multi-rail ones.
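To make the multi-rail point concrete: each 12V rail has its own over-current protection (OCP) trip point, and on many units those per-rail limits add up to more than the supply can actually sustain. A card spreading a heavy load across rails can then overload the unit without ever tripping a single rail. A rough sketch with invented numbers (not any specific PSU's specs):

```python
# Hypothetical multi-rail PSU: per-rail OCP limits sum past total capacity.
# All figures are made up for illustration.

rails_ocp_amps = [25, 25, 25, 25]   # per-rail 12V over-current trip points
total_12v_capacity_watts = 900      # what the unit can really sustain combined

sum_of_limits_watts = sum(amps * 12 for amps in rails_ocp_amps)
print(sum_of_limits_watts)          # 1200 W of "allowed" draw on paper

# A load spread across the rails can stay under every per-rail limit
# (260 W < 300 W each) while exceeding the real combined capacity:
load_per_rail_watts = [260, 260, 260, 260]
overloaded = (sum(load_per_rail_watts) > total_12v_capacity_watts and
              all(load < amps * 12
                  for load, amps in zip(load_per_rail_watts, rails_ocp_amps)))
print(overloaded)  # True: no single rail trips, but the unit is overloaded
```

That would match the symptom described: the protection never fires, and the supply fails under a sustained cross-rail load.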
 
So when are we going to see the GTX480 Co-op Edition with a G92 for PhysX! lmao.
 

Yeah, I always use single-rail PSUs, but still...

They're definitely pushing the line on the manufacturing process. For sure they sent cherry-picked samples to reviewers. Not all cards draw the same amount of juice, am I right? Some chips might be leakier, run hotter, or burn more watts than others.

Sounds like it might be a QC issue. Was it all the same brand of card?
 

I'll ask the shop they got returned to, but I don't believe they were. Some of the reviewers show higher loads as well; I have seen anywhere from 300W to 340W and up.
 
300-340W, the variance is great.
 
That's a good difference, haha. That's like an extra 8400GS in the system.

Does the speed/way in which a card demands power also make a difference? i.e., if I plug something into my PSU that instantly demands 700W, then the next nanosecond 100W, and then 850W again a few ms later, repeatedly, does that make any difference to the strain on the PSU versus just demanding 700W continuously?
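On the transient question: yes, the shape of the demand matters, not just the average. The supply has to ride through every peak and cope with the rate of change, which stresses its regulation and output capacitors more than a steady draw of the same average would. A toy illustration with a made-up load waveform:

```python
# Toy bursty load vs a steady one. The average draw can look modest,
# but the PSU must still meet every peak, and large fast swings stress
# its output filtering and regulation. Numbers are purely illustrative.

samples_watts = [700, 100, 850, 100, 850, 100, 700, 100]  # draw per time slice

average = sum(samples_watts) / len(samples_watts)
peak = max(samples_watts)
largest_step = max(abs(b - a) for a, b in zip(samples_watts, samples_watts[1:]))

print(average)       # 437.5 W average -- looks tame on paper
print(peak)          # 850 W -- the level the PSU actually has to deliver
print(largest_step)  # 750 W swing between adjacent slices
```

So a card that oscillates between 100W and 850W can be harder on a supply than one sitting at a flat 700W, even though its average draw is lower.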
 