Monday, August 9th 2010

GF100 512 Core Graphics Card Tested Against GeForce GTX 480

NVIDIA seems to have overcome its initial hiccups with the GF100 graphics processor, and could release a new graphics card that makes use of all 512 CUDA cores and 64 TMUs on the GPU. The GeForce GTX 480 was initially released as the top SKU based on the GF100, with 480 of its 512 CUDA cores enabled. What NVIDIA will call the new SKU is subject to some speculation. GPU-Z screenshots show that the 512 core model has the same device ID (hence the same name, GeForce GTX 480), which suggests a specifications update to the same SKU à la GeForce GTX 260 (216 SP), though release-grade models could still carry a different device ID and name.

Expreview carried out a couple of tests on the 512 core "GTX 480" graphics card and compared it to the 480 core model that's out in the market, using NVIDIA GeForce 258.96 drivers. In the 3DMark Vantage Extreme preset, the 512 core card got a GPU Score of 10,072 points to the 480 core card's 9,521 points. The additional TMUs had an evident impact on texture fillrate: 41.55 GTexel/s for the 512 core card against 38.82 GTexel/s for the 480 core card.
In the second test, Crysis Warhead at the Enthusiast preset, 1920 x 1080 px with 8x AA, the 512 core card churned out 34.72 fps, while the 480 core card trailed at 32.96 fps. In this short bench, the fully-enabled GF100 card is 5~6% faster than the GeForce GTX 480. If NVIDIA manages to release the SKU at the same price-point as the GTX 480, as it did with the GTX 260-216, it will further increase NVIDIA's competitiveness against AMD's ATI Radeon HD 5970, which is still the fastest graphics SKU in the market. Below are screenshots comparing the scores of both cards.
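For reference, the gains implied by those numbers work out as follows (a quick sketch using only the figures reported above):

```python
# Figures reported by Expreview: (512-core sample, retail 480-core GTX 480)
scores = {
    "3DMark Vantage GPU Score (Extreme)": (10072, 9521),
    "Texture fillrate (GTexel/s)":        (41.55, 38.82),
    "Crysis Warhead 1080p 8xAA (fps)":    (34.72, 32.96),
}

for name, (full, cut) in scores.items():
    gain = (full - cut) / cut * 100  # percentage advantage of the 512-core card
    print(f"{name}: +{gain:.1f}%")
```

The fillrate delta (~7%) tracks the extra enabled TMUs, while the 3DMark and game results land in the 5~6% range quoted above.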
Source: Expreview

90 Comments on GF100 512 Core Graphics Card Tested Against GeForce GTX 480

#51
EastCoasthandle
BenetanegiaAccording to empirical data on the net about how Fermi behaves, it is quite the opposite. Power draw depends vastly on the clock and very little on enabled/disabled parts.

For example, the GTX465 consumes almost as much as the GTX470 (~20w difference, which is a 10%) despite having 25% of the core disabled, but the GTX470 on the other hand consumes almost 100w less (50% difference) than the GTX480, although it only has 7% of shader cores and 20% of ROPs disabled. That discrepancy comes from the clock difference, and anyone who has ever OCed a GTX470 knows that. Conclusion: the power draw difference on the 512 part would be negligible.
That's a long way of saying it will consume more :wtf:. In any case, we will find out all the details if/when such a video card comes out and is reviewed properly.
Posted on Reply
#52
Benetanegia
EastCoasthandleThat's a long way of saying it will consume more :wtf:. In any case, we will find out all the details if/when such a video card comes out and is reviewed properly.
No, that's a long way of saying it will not consume more, unless it's clocked higher. I was responding to the claim that the power consumption increase will be higher than the performance increase, which is not true.

And if it's based on a revised part, as has been suggested (revision number blurred), anything could happen. For example, the power draw of such a part could have the same perf/watt difference compared to the current GTX480 as the GTX460 has with the GTX465, which would make it consume less than the GTX470. At this point anything is possible.
Posted on Reply
#53
EastCoasthandle
BenetanegiaNo that's a long way of saying it will not consume more, unless it's clocked higher. I was responding to the claim that power consumption increase will be higher than the performance increase, which is not true.

And if it's based on a revised part, as has been suggested (revision number blurred), anything could happen. For example, the power draw of such a part could have the same perf/watt difference compared to the current GTX480 as the GTX460 has with the GTX465, which would make it consume less than the GTX470. At this point anything is possible.
If the information about it so far is true, it will consume more (final clocks, etc.). We will see (that's if it actually comes out). No need to get upset about it. ;)
Posted on Reply
#54
Benetanegia
EastCoasthandleIf the information about it so far is true it will consume more (final clocks, etc).
And that again is a bold statement that contradicts every bit of information we have about GF100. All the info says it will have exactly the same clocks as the 480 SP model. Like I said, there's little difference in power draw between the GTX465 and 470, and there's a 25% difference in enabled silicon. To be precise, there's a 10% power difference. The 512 SP version will only have 3-4% more silicon enabled, so do the math: the power difference would be 1-2%. And that's assuming everything in the card itself is the same; a revised PWM, revised PCB, revised cooler... all of them would have a far bigger impact on power consumption than the chip itself.

So no, you just cannot affirm that it will consume more based on the fact that it will have 4% more silicon enabled, because the slightest change to the card's design would make a much greater difference. And that's assuming these pics are not of a new revision that includes all the optimizations made on GF104.
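The linear-scaling estimate above can be written out explicitly (a back-of-the-envelope sketch using only the percentages quoted in this post, not measured data):

```python
# Back-of-the-envelope: scale the power delta linearly with enabled silicon,
# using the GTX 465 -> GTX 470 pair as the reference point.
power_delta_465_470 = 0.10    # ~10% power difference between the two cards
silicon_delta_465_470 = 0.25  # ~25% more silicon enabled on the GTX 470

power_per_silicon = power_delta_465_470 / silicon_delta_465_470  # ~0.4

silicon_delta_480_512 = 0.035  # the 512 SP part enables ~3-4% more silicon
estimated_power_delta = power_per_silicon * silicon_delta_480_512
print(f"Estimated power increase: ~{estimated_power_delta * 100:.1f}%")  # ~1.4%
```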
Posted on Reply
#55
DaedalusHelios
the54thvoidWhy?

Why oh why?

Nvidia already has the fastest single core GPU. (and hottest, loudest etc). Why would they use the GF100 for this? For 6-7% increase....

Unless it's a partner doing it and not NV?

Although if Southern Islands makes it out this year perhaps this is NV's attempt to dull down ATI's 5xxx series revision.

Two words - performance / watt. That's all that counts. No point having a 6-7% faster card if it's technologically backwards in respect of power draw.
Performance is performance... power draw is a different concern. Electricity is not expensive where I live. If "GPU A" draws even 200 watts more than "GPU B", it isn't going to cost me more than maybe $5 extra a month (half the cost of a nice sandwich) when gaming with it 4 hours a day. And I don't think it is a 200 watt difference. So it isn't a big deal. Lower power draw would have been nice, though. Also, I am assuming you don't have a poorly ventilated case or a low-wattage PSU to worry about. I don't have any GTX 4xx cards, and I have many ATi 5xxx series cards. It's not like I think NVIDIA is doing a bad job. It is competition, which is good for the consumer. We don't have to take sides like they are sports teams. ;)
Posted on Reply
#56
EastCoasthandle
BenetanegiaAnd that again is a bold statement that contradicts every bit of information we have about GF100. All the info says it will have exactly the same clocks as the 480 SP model. Like I said there's little difference in power draw between the GTX465 and 470 and there's a 25% difference in enabled silicon. To be precise there's a 10% power difference. The 512 SP version will only have 3-4% more silicon enabled, so do the math, power difference would be 1-2%. And that's assuming everything in the card itself is the same: a revised PWM, revised PCB, revised cooler... all of them would have a far bigger impact on power consumption than the chip itself.

So no, you just cannot affirm that it will consume more based on the fact that it will have 4% more silicon enabled, because the slightest change to the card's design would make a much greater difference. And that's assuming these pics are not of a new revision that includes all the optimizations made on GF104.
Why are you arguing? I have no need to argue with you. But I can show you results.
source.
Power Consumption Idle
512c.....158w
480c.....141w

Power Consumption Load
512c.....644w
480c.....440w

Now it would have been silly of me to have gotten into an argument with you when no information was available at the time. However, with this first peek, I will await more reviews for confirmation.
Posted on Reply
#57
MadMan007
All the gamers here have lost sight of the fact that the full Fermi architecture does play a huge role in NV's strategic product outlook, it's just in HPC, not gaming. Perhaps going forward NV will be smart and separate the two from the get-go, even though that may be worse from a production standpoint.
Posted on Reply
#58
D4S4
Now that power draw is plain epic. :eek:

But I don't believe it came just from unlocking the rest of the chip.
Posted on Reply
#59
HillBeast
EastCoasthandleWhy are you arguing? I have no need to argue with you. But I can show you results.
source.
Power Consumption Idle
512c.....158w
480c.....141w

Power Consumption Load
512c.....644w
480c.....440w

Now it would have been silly of me to have gotten into an argument with you when no information was available at the time. However with a 1st peek I will await more reviews for confirmation.
Based on my calculations, this card breaks PCI-e specifications. If we assume they are measuring full system power consumption, and that a normal Fermi uses about 320W (I think that's right for a GF100 GTX480), then the 480 SP load figure (440W) minus the Fermi load (320W) leaves 120W for the system. Subtract that system consumption from the SP512 figure (644W - 120W) and THAT'S 524W!

HOLY CRAP! The card has 2 8 pin power connectors totalling 300W (don't go on to me about power supplies being able to supply more, because that's irrelevant) plus 75W from the PCI-e slot, so that's still 149W over! How the heck can they justify such a huge leap in power consumption over the SP480 with such a small improvement in performance?

Fail. Epic fail. Uber epic ULTIMATE FAIL! GF100 is FAIL!!!
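For clarity, the arithmetic behind that estimate (note the ~320W GTX 480 board power is an assumption, as stated above):

```python
# Reproduce the estimate: derive system overhead from the 480 SP measurement,
# then compare the implied 512 SP board power against the power budget.
GTX480_BOARD_POWER = 320           # assumed board power of the 480 SP card (W)
LOAD_480SP, LOAD_512SP = 440, 644  # full-system load figures from Expreview (W)

system_overhead = LOAD_480SP - GTX480_BOARD_POWER   # 440 - 320 = 120 W
implied_board_power = LOAD_512SP - system_overhead  # 644 - 120 = 524 W

budget = 2 * 150 + 75  # two 8-pin connectors + PCI-e slot = 375 W
print(f"Implied board power: {implied_board_power} W, "
      f"over budget by {implied_board_power - budget} W")
```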
Posted on Reply
#60
pr0n Inspector
HillBeastBased on my calculations, this card breaks PCI-e specifications. If we assume they are doing full system power consumption and that the normal Fermi uses about 320W of power (I think that's right for a GF100 GTX480), then if we take the power consumption load of the SP480 (440W) and take away the Fermi load (320W) = 120W for system, then if we take the system consumption away from the SP512 model (644 - 120W) THAT'S 524W!

HOLY CRAP! The card has 2 8 pin power connectors totalling 300W (don't go on to me about power supplies being able to supply more, because that's irrelevant) plus 75W from the PCI-e slot, so that's still 149W over! How the heck can they justify such a huge leap in power consumption over the SP480 with such a small improvement in performance?

Fail. Epic fail. Uber epic ULTIMATE FAIL! GF100 is FAIL!!!
Because some random Chinese website on the Internet is so reliable, right?

Also, the existence of 6+2 pin connectors means PSU manufacturers are already ignoring ATX specifications.
Posted on Reply
#61
Bjorn_Of_Iceland
dalekdukesboyand that takes a lot for someone to say when I see in his sig rig he has a gtx480 of all things!!
Myeah... well, after all, this is a tech forum where we discuss anything tech-related intelligently. It's just that sometimes people get overzealously sentimental / rabidly hostile towards some brand or belief, which IMO is pretty pointless.
Posted on Reply
#62
newfellow
Hmm, weird damn results there... the GPU score for a single HD5850 OC'd is 18,500 points, and the feature scores are a lot higher. Those cannot by any means be true readings in the screenshots. So why is there only 9K/10K in the screenshots?
Posted on Reply
#63
slyfox2151
newfellowhmm, weird damn results there.. GPU score for a 1 HD5850 OC'd up is 18500 Points and feature scores are a lot higher. Now those cannot by any mean be true readings on screenshots. So, why is there like 9K/10K at the screenshots.
Er... because it's running in EXTREME mode??

I'd like to see a 5850 get anywhere near 9,000 points in Extreme.
Posted on Reply
#64
btarunr
Editor & Senior Moderator
WhiteyDo you think the cooler will have heatpipes coming out the side still ?

Or something like the Galaxy vapour chamber cooling ?
I expect it to have the same cooling solution as the GTX 480, except with a more hair-trigger fan profile.
Posted on Reply
#65
HillBeast
pr0n InspectorBecause some random Chinese website on the Internet is so reliable, right?

Also, the existence of 6+2 pin connectors means PSU manufacturers are already ignoring ATX specifications.
Dude, I don't know what Chinese class you took, and I haven't taken any form of Chinese lesson, but I can read that just fine. No idea what you're on about with the Chinese stuff.

Also, I was talking about PCI-e specifications, not ATX. If you'd read my post right the first time, I wouldn't have to explain myself.
Posted on Reply
#66
claylomax
EastCoasthandleWhy are you arguing? I have no need to argue with you. But I can show you results.
source.
Power Consumption Idle
512c.....158w
480c.....141w

Power Consumption Load
512c.....644w
480c.....440w

Now it would have been silly of me to have gotten into an argument with you when no information was available at the time. However with a 1st peek I will await more reviews for confirmation.
No wonder. Just look at the load voltage. :laugh:
Posted on Reply
#67
HillBeast
claylomaxNo wonder. Just look at the load voltage. :laugh:
It's not that much of a difference.
Posted on Reply
#68
phanbuey
MadMan007All the gamers here have lost sight of the fact that the full Fermi architecture does play a huge role in NVs strategic product outlook, it's just in HPC not gaming. Perhaps going forward NV will be smart and separate the two from the getgo even though that may be worse from a production standpoint.
The whole point is to make the technologies converge. HPC requires computational power... gaming (essentially 3D rendering) requires computational power. Converging them makes much more sense than trying to design two different products that do what can potentially be the same thing.
Posted on Reply
#69
crazyeyesreaper
Not a Moderator
As for consumption, did anyone forget that cards are available that use three 8-pin (6+2 pin) connectors? That comes to 150+150+150+75 = 525W, so there ya go, problem solved.
Posted on Reply
#70
HillBeast
crazyeyesreaperas for consumptions uh did anyone already forget cards are avaible that use 3 6pins or 3 6+2 pins that comes to 150+150+150+75 525w so there ya go problem solved
Yeah, but in the review the sample had two 8-pins, AKA two 6+2 pin connectors.
Posted on Reply
#71
Meizuman
HillBeastUm, what was it I was going to say? Oh right, "BWAHAHAHAHAHA to the fools who bought the GTX480 with the assumption it would be the most powerful card from this generation. Sucks to be you. The 512SP version is better."
OR

"BWAHAHAHAHAHA to the fools who didn't bought the GTX480 because the assumption it would be the most power-hungy card from this generation. Sucks to be you. The 512SP version is hotter!"

:wtf:
Posted on Reply
#72
CDdude55
Crazy 4 TPU!!!
MeizumanOR

"BWAHAHAHAHAHA to the fools who didn't bought the GTX480 because the assumption it would be the most power-hungy card from this generation. Sucks to be you. The 512SP version is hotter!"

:wtf:
Both of those statements are wrong lol (yours and HillBeast's).
Posted on Reply
#73
erocker
*
EastCoasthandleWhy are you arguing? I have no need to argue with you. But I can show you results.
source.
Power Consumption Idle
512c.....158w
480c.....141w

Power Consumption Load
512c.....644w
480c.....440w

Now it would have been silly of me to have gotten into an argument with you when no information was available at the time. However with a 1st peek I will await more reviews for confirmation.
204 extra watts for 32 extra "CUDA cores"?! Something isn't right. I wonder if this website took a standard GTX480 and got their hands on a 512 core BIOS.
Posted on Reply