Thursday, February 25th 2010

GeForce GTX 400 Series Performance Expectations Hit the Web

A little earlier this month, NVIDIA tweeted that it would formally unveil the GeForce GTX 400 series graphics cards, NVIDIA's DirectX 11 generation GPUs, at the PAX East gaming event in Boston (MA), United States, on the 26th of March - a little under a month from now. In the run-up, sources with access to samples of the graphics cards appear to be forming their "performance expectations," among other details trickling in.

Both the GeForce GTX 480 and GTX 470 graphics cards are based on NVIDIA's GF100 silicon, which physically packs 512 CUDA cores, 16 geometry units, 64 TMUs, 48 ROPs, and a 384-bit GDDR5 memory interface. While the GTX 480 is a full-featured part, the GTX 470 is slightly watered down, with probably 448 or 480 CUDA cores enabled and a slightly narrower memory interface, probably 320-bit GDDR5. Sources tell DonanimHaber that the GeForce GTX 470 performs somewhere between the ATI Radeon HD 5850 and Radeon HD 5870. This part is said to have a power draw of 300 W. The GeForce GTX 480, on the other hand, is expected to perform on par with the GeForce GTX 295 - at least in existing (present-generation) applications. A recent pre-order listing by an online store put the GTX 480 at US $699.

Source: DonanimHaber
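As a back-of-the-envelope check on those interface widths, peak GDDR5 bandwidth scales linearly with bus width. The sketch below compares 256-, 320-, and 384-bit interfaces; the 3.6 GT/s effective data rate is an illustrative assumption, not a confirmed GF100 specification.

```python
# Sketch: theoretical peak bandwidth of a GDDR5 memory interface.
# The 3.6 GT/s effective data rate is an assumed, illustrative figure.

def gddr5_bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (effective transfer rate in GT/s)."""
    return bus_width_bits / 8 * data_rate_gtps

# Compare the bus widths mentioned above at the same assumed data rate:
for bits in (256, 320, 384):
    print(f"{bits}-bit: {gddr5_bandwidth_gbps(bits, 3.6):.1f} GB/s")
```

At the same data rate, the 384-bit interface offers 50% more raw bandwidth than a 256-bit one - which is the trade-off (bandwidth vs. board complexity and power) debated in the comments below.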
Add your own comment

114 Comments on GeForce GTX 400 Series Performance Expectations Hit the Web

#1
aCid888*
If the specs and performance are to be believed then I think the old saying 'You snooze, you lose' comes to mind. :shadedshu
Posted on Reply
#2
TheLaughingMan
by: mdm-adph
Well, that's the way it goes. Nvidia enjoyed their time on the top, and now the crown's passed to ATI/AMD.

3-4 years from now, the GTX 600 series will come out, and will take it back.

Now do you see why a lack of competition is bad? :p Nvidia's cards were the fastest for so long, that they apparently skimped on R&D and got left behind.
Failure of proper R&D is not a result of lack of competition. Strangely high prices, a skewed price/performance ratio, lack of advertisement - all of those, definitely. Lack of R&D is just them being cheap dicks.

ATI was not exactly non-existent and has taken the performance crown from them before on several occasions in the past few years. This is just a failure on Nvidia's part through and through.
Posted on Reply
#3
eidairaman1
The failing chips in the G92, the rebadging of the 8800 series, the failure of the Mobility parts, and now the longest delay yet on NV's GF100 - it's all giving NV big migraines.
Posted on Reply
#4
ucanmandaa
I think Fermi will earn its place in history alongside the FX 5800 Ultra and R600 (2900 XT) as a hot, power-hungry and pricey (price/performance-wise) card... I could even mention the Voodoo5 6000, but it was never officially released.
Also, all the cards mentioned above were released after delays... except the Voodoo, of course.
Posted on Reply
#5
TheLaughingMan
Sad. The card has not even been released yet, and we are all going, "Sorry Green Team, maybe next time." I find it quite funny actually.

Bad Nvidia, bad.
Posted on Reply
#6
..'Ant'..
Well, if it does turn out to be that bad, then I'm going to get another GTX 285 to SLI.
Posted on Reply
#7
[I.R.A]_FBi
by: DanishDevil
Guess the enthusiasts will be waiting for the GTX495 then.
on two separate cards?
Posted on Reply
#8
aj28
by: [I.R.A]_FBi
on two separate cards?
Funny because it's true. Of course, they won't need to if it's able to best the 5970, but I think it's a well-accepted fact that AMD has more than enough room to drop prices on the entire HD 5000 series without exception and beat nVidia into the ground. Question is, why would they? Must be the same reason Intel doesn't want to do the same to AMD. It's not about anti-trust so much as it is about keeping a healthy buffer space so everyone makes money. We'll see some good price cuts over the summer, I think, but it's not going to be 50%.
Posted on Reply
#9
mooch37
I love how some people say $700 for the Fermi is way too expensive (and it is) but then they say it's so much cheaper to go with a 5970. They're practically the same price. Maybe a $50 difference, but that's about it.
Posted on Reply
#10
eidairaman1
Couldn't care less - I'm going with 2 5890s when they arrive.
Posted on Reply
#11
qwerty_lesh
can someone post a tl;dr version of all the comments pl0x
say summarized in bullets or the likes

tyvm kthxbye.
Posted on Reply
#12
TheLaughingMan
by: mooch37
I love how some people say $700 for the Fermi is way too expensive (and it is) but then they say it's so much cheaper to go with a 5970. They're practically the same price. Maybe a $50 difference, but that's about it.
If the 470 is expected to fall between the 5850 and 5870, then it is safe to assume the 480 will be above the 5870, but may not reach 5970's performance. If it does not at least match it, $50 is a big deal to pay for a brand name and less performance.

Now let's say the performance is only 5 to 10% better than a 5870; then your gap is more like $225. That is huge for so small a boost.
Posted on Reply
#13
HossHuge
by: the54thvoid

Remember how good the 4870 was and then NV went 'BAM!!' GTX 280. All they do is punch with a bigger glove.
How could that be when the GTX280 came out a week before the 4870?
Posted on Reply
#14
Wile E
Power User
Man, I hope those wattage claims aren't true. This thing will be a heat monster if that's the case.
Posted on Reply
#15
LAN_deRf_HA
I don't see why anyone would be surprised. It's very common for the new flagship card to trade blows with the dual gpu card of last round, not completely surpass it.
Posted on Reply
#16
eidairaman1
by: LAN_deRf_HA
I don't see why anyone would be surprised. It's very common for the new flagship card to trade blows with the dual gpu card of last round, not completely surpass it.
zzzzzz
Posted on Reply
#17
shevanel
I DON'T MISS THE GTX 275! What I'm trying to say is I will never own another HOT HOT card... I hate sweating.
Posted on Reply
#18
a_ump
OK, so it was stated the GTX 470 is to have a 300 W TDP and performance between the HD 5850 and HD 5870. So how in the hell is the GTX 480, with the full 512 CUDA cores and likely higher clocks, going to stay within PCIe TDP specifications? And how is it going to soar up to HD 5970 performance when the GTX 470, which is only slightly crippled, sits between the HD 5850 and HD 5870? For the GTX 480 to compete with the HD 5970 it'd have to be a little more than double the performance of the GTX 470... which obviously isn't possible. The only way it'd be possible is with much higher clocks, but with the GTX 470 at 300 W TDP already, there's no room for significant clock increases.

EDIT: someone above said no waving of victory flags yet, but how can you not? Nvidia can't take the crown, can't beat ATI in wattage, and most of all can't beat them in price, as ATI can definitely drop prices more than Nvidia can. Nvidia is going to be beaten in all aspects if you truly think about it as I've laid it out. The only way for Fermi to be relevant in the gaming arena is for them to sell the GTX 4xx at ridiculously low prices and take the hit - and by low prices I mean low.
Posted on Reply
#19
buggalugs
I'm sorry, but I'm gonna LOL at people who waited for Fermi. ATI already learned that a wide memory bus with GDDR5 doesn't work well. Nvidia is like 3 years behind where ATI is now, and they are making the same mistakes ATI did with the R600.
Posted on Reply
#20
dogchainx
I'm going to LOL at people who start bashing each other because of their purchases and loyalty to a corporation.
Posted on Reply
#21
Wile E
Power User
by: buggalugs
I'm sorry, but I'm gonna LOL at people who waited for Fermi. ATI already learned that a wide memory bus with GDDR5 doesn't work well. Nvidia is like 3 years behind where ATI is now, and they are making the same mistakes ATI did with the R600.
Your statement about wide memory buses makes no sense whatsoever. GDDR5 works no differently with wide buses than it does with narrow ones. More bus width just adds bandwidth. More bandwidth does not hurt. It may not be needed, but it certainly doesn't hurt anything.

Even with these rumors abound, I'm still waiting until fermi releases to see what happens in the market.

I LOL at people that make decisions based on rumors.
Posted on Reply
#22
_33
http://www.techreport.com/discussions.x/18525
We now know that Nvidia will officially announce its GeForce GTX 480 and 470 graphics cards on March 26. Only some of Nvidia's card partners may be at the party, however. DigiTimes has learned from anonymous sources that most of Nvidia's second-tier partners still haven't received "complete reference board designs."

Nvidia reportedly intends to prioritize "first-tier makers or makers that only produce Nvidia cards." As DigiTimes points out, XFX and PNY versions of the upcoming GF100 cards have already shown up for pre-order in the United States, so those partners will presumably be among those receiving preferential treatment. Cards from tier-two manufacturers may not start shipping until April.

On a separate note, DigiTimes writes that "some market watchers" don't see a price war between Nvidia and AMD occurring until after May. At issue are those pre-order listings, which showed price tags of $679.99 for the GeForce GTX 480 and $499.99 for the GeForce GTX 470. AMD only has one card in that price range: the dual-GPU Radeon HD 5970, which starts at around $650 and doesn't seem to be very widely available.

If you're waiting for lower-end GF100 derivatives to fight it out with AMD's mainstream Radeon HD 5000-series cards, well, you might want to be patient. When asked about such GPUs recently, Nvidia CEO Jen-Hsun Huang stated that current-generation GeForces are "fabulous" and will "continue to do quite nicely in the marketplace." He also suggested that mainstream users may not need DirectX 11 cards to begin with, although he did also promise a quick transition to newer products.
Posted on Reply
#23
wahdangun
by: Wile E
Man, I hope those wattage claims aren't true. This thing will be a heat monster if that's the case.
That's why the case must have Fermi certification - it has to have a wind tunnel in it :laugh:
Posted on Reply
#24
buggalugs
by: Wile E
Your statement about wide memory buses makes no sense whatsoever. GDDR5 works no differently with wide buses than it does with narrow ones. More bus width just adds bandwidth. More bandwidth does not hurt. It may not be needed, but it certainly doesn't hurt anything.
Yes, it does. I remember reading about it - something about diminishing returns and the high power draw required to keep it fed.

That's why, after the R600 debacle (with its 512-bit bus), ATI moved back to a 256-bit bus with the 3870/4870, and the very powerful 5870 still has a 256-bit memory bus.

If it were that easy or worthwhile, ATI would have made the 4870 or the upgraded 4890 with a wider memory bus. They already tried it and it wasn't worth it. Nvidia has been using GDDR3, which benefits from a wider memory bus. With GDDR5 there's already plenty of bandwidth on a 256-bit bus.

As an analogy, it's like having 4x 5970s in a computer: after two of them there's not much performance increase, if any. So you have a hot and power-hungry setup that is inefficient, just like the R600 was.

I'm not surprised the power draw is as high as they say.
Posted on Reply
#25
Imsochobo
by: ucanmandaa
I think Fermi will earn its place in history alongside the FX 5800 Ultra and R600 (2900 XT) as a hot, power-hungry and pricey (price/performance-wise) card... I could even mention the Voodoo5 6000, but it was never officially released.
Also, all the cards mentioned above were released after delays... except the Voodoo, of course.
The 2900 XT was something; the FX series really wasn't.

But you're mostly right: the 2900 series was nothing for the average performance-oriented consumer, but it did beat EVERYthing when watercooled! :D
The clocks it achieved were staggering :D Speaking from experience - that's the sole reason I bought it.

Anyhow, this card might end up like the X1800/X1900 series, except hot as well; ATI had the expensive parts back then, but they were faster.

Nvidia enjoyed those times, but this time it might be the other way around. I don't doubt that Fermi will be fast,
but they aren't gonna blow ATI away with it - far from it. They may just stay in the game, and maybe prove themselves next time.
Posted on Reply
Add your own comment