Monday, June 21st 2010

NVIDIA GF104 Package Pictured

One of the first pictures of NVIDIA's upcoming GF104 graphics processor has come to light, with a Chinese source picturing a GF104 qualification sample. The sample is based on the A1 silicon. The GPU package is similar to that of the GF100: it makes use of an integrated heat-spreader (IHS) to dissipate heat from the die underneath it. The package is rectangular rather than square (probably a move to reduce the board footprint, translating into more compact boards). The GPU is built on TSMC's 40 nm process and is said to have a significantly lower TDP than the GF100. One of the first SKUs built around it is the GeForce GTX 460.

Contrary to older reports, Expreview's report suggests that the GeForce GTX 460 will have 336 CUDA cores (instead of 256) and 768 MB of memory across a 192-bit GDDR5 memory interface. Its TDP is expected to be around the 150 W mark, similar to that of a GeForce GTS 250. It will target a price point slightly above the $200 mark, while other SKUs carved out of this silicon will be priced lower.
Sources: Expreview, Zol.com.cn

30 Comments on NVIDIA GF104 Package Pictured

#1
newtekie1
Semi-Retired Folder
I hope this actually is a decent performer to add a little competition in the mid-range market.
Posted on Reply
#2
mstenholm
With 336 CUDA cores it will be a good folder for the money, provided $200 turns out to be the price.
Posted on Reply
#3
Imsochobo
newtekie1: I hope this actually is a decent performer to add a little competition in the mid-range market.
NVIDIA does have a slight shot with this.
Still, 150 W is a high number, at least when it's going to be competing with the 5770 (it will most likely land in between the 5770 and the 5830).

But its price does seem appealing, and 336 CUDA cores is acceptable. 192-bit, though? 256-bit would be good on this chip if those CUDA cores were as effective as the GTX 285's, but the ones in the 465/470/480 don't show that.

Only time will tell, but the price point does look like a relief to me: push ATI where it "hurts" them; ATI has loads of headroom on prices.
Posted on Reply
#4
mdsx1950
Any chance of a dual GPU card?
Posted on Reply
#5
btarunr
Editor & Senior Moderator
Imsochobo: NVIDIA does have a slight shot with this.
Still, 150 W is a high number, at least when it's going to be competing with the 5770 (it will most likely land in between the 5770 and the 5830).

But its price does seem appealing, and 336 CUDA cores is acceptable. 192-bit, though? 256-bit would be good on this chip if those CUDA cores were as effective as the GTX 285's, but the ones in the 465/470/480 don't show that.

Only time will tell, but the price point does look like a relief to me: push ATI where it "hurts" them; ATI has loads of headroom on prices.
The 192-bit interface is most likely there to reduce the pin count. At 1000 MHz, 192-bit GDDR5 still delivers 96 GB/s, and at 1200 MHz it's 115.2 GB/s. That's plenty of bandwidth compared to the 76.8 GB/s the HD 5770 gets at 1200 MHz (see the quick calculation below).
Posted on Reply
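(As an aside, here is a minimal sketch of the bandwidth arithmetic used above, assuming the usual GDDR5 convention of four data transfers per memory-clock cycle and the HD 5770's 128-bit bus; the function name is purely illustrative.)

```python
# Peak bandwidth = effective transfer rate x bus width in bytes.
# GDDR5 moves four bits per pin per memory-clock cycle, so a
# 1000 MHz memory clock corresponds to 4000 MT/s effective.
def gddr5_bandwidth_gbps(memory_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth in GB/s (1 GB = 10^9 bytes)."""
    transfers_per_second = memory_clock_mhz * 1e6 * 4  # quad data rate
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_second * bytes_per_transfer / 1e9

print(gddr5_bandwidth_gbps(1000, 192))  # 96.0  -> GTX 460 at 1000 MHz
print(gddr5_bandwidth_gbps(1200, 192))  # 115.2 -> GTX 460 at 1200 MHz
print(gddr5_bandwidth_gbps(1200, 128))  # 76.8  -> HD 5770 at 1200 MHz
```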
#6
Imsochobo
mdsx1950: Any chance of a dual GPU card?
Uhm, why would you want that? Go 470/480 if you really want NV.
ATI pushes a 5850 at 150 W, so a 5970 is a way better deal: better efficiency AND more horsepower.

Galaxy will probably launch one if you really, really want one, but in a while.

But it would just hit a 300 W max TDP with dual GPUs at stock speeds, or maybe 280 W.
Posted on Reply
#7
btarunr
Editor & Senior Moderator
mdsx1950: Any chance of a dual GPU card?
That would just be on par with GTX 485.
Posted on Reply
#8
Unregistered
Yawn... call me when they reach 32 nm with a good performance-to-power ratio.
Posted on Edit | Reply
#9
the_wolf88
WTF?!

I read this article 3 days ago and TPU is only putting it up now!!

I thought there would be more information about this chip, but in the end I found it's the same as the one on the Expreview site!

Anyway, I still think it draws a lot of power!!

150 W for performance maybe like a 5770?! Too much!
Posted on Reply
#10
phanbuey
Hopefully sometime this decade they will release a sub-$200 card that matches the performance of a 275.

Performance per dollar, NVIDIA hasn't moved a whole lot in a very long time...
Posted on Reply
#11
KainXS
Well, seeing as a GTX 465 is about the performance of a GTX 285, maybe this one will do that.

It's pretty obvious things did not go as planned in NVIDIA land this year. Maybe next year.
Posted on Reply
#12
Imsochobo
KainXS: Well, seeing as a GTX 465 is about the performance of a GTX 285, maybe this one will do that.

It's pretty obvious things did not go as planned in NVIDIA land this year. Maybe next year.
I believe in their arch.
But it's too early :P
Posted on Reply
#13
RadeonProVega
Why 768 MB with 192-bit?

Why not 1 GB w/ 256-bit or 2 GB w/ 256-bit?
Posted on Reply
#14
newtekie1
Semi-Retired Folder
u2konline: Why 768 MB with 192-bit?

Why not 1 GB w/ 256-bit or 2 GB w/ 256-bit?
The mid-range GPU is usually half the high end, so 384 / 2 = 192 (see the memory-math sketch below).
Posted on Reply
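(As an aside, a minimal sketch of how the capacity falls out of the bus width, assuming the usual layout of one GDDR5 chip per 32-bit channel; the chip densities are illustrative.)

```python
# Each GDDR5 chip hangs off a 32-bit channel, so the chip count
# (and hence the sensible capacities) scales with the bus width.
def frame_buffer_mb(bus_width_bits: int, chip_density_mb: int) -> int:
    """Total memory in MB with one GDDR5 chip per 32-bit channel."""
    chips = bus_width_bits // 32
    return chips * chip_density_mb

print(frame_buffer_mb(384, 128))  # 1536 -> GF100/GTX 480: 12 x 1 Gb chips
print(frame_buffer_mb(192, 128))  #  768 -> GTX 460: 6 x 1 Gb chips
print(frame_buffer_mb(192, 256))  # 1536 -> the 2 Gb-chip option mentioned below
```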
#15
slyfox2151
u2konline: Why 768 MB with 192-bit?

Why not 1 GB w/ 256-bit or 2 GB w/ 256-bit?
Price.

That much RAM plus a wider bus would cost a lot more to make for very little performance gain.
1 GB+ is just not needed for the performance of this card.
Posted on Reply
#16
overclocking101
Yep, NVIDIA had to cut the RAM in half to keep the price down; I'm sure they will eventually release a 1 GB variant. Is it me, or does NVIDIA consistently release cards with odd amounts of RAM? 768, 896, 1700, 1500... why not even numbers? Is it because of the bigger memory bus?
Posted on Reply
#17
kid41212003
768MB is a perfect amount for gamers.

512MB is not good enough for 1680x1050 anymore.
Posted on Reply
#18
slyfox2151
overclocking101: Yep, NVIDIA had to cut the RAM in half to keep the price down; I'm sure they will eventually release a 1 GB variant. Is it me, or does NVIDIA consistently release cards with odd amounts of RAM? 768, 896, 1700, 1500... why not even numbers? Is it because of the bigger memory bus?
odd numbers are cooler :P

makes them stand out more.
Posted on Reply
#19
1c3d0g
I still feel NVIDIA could do a lot better on their GPU TDP numbers, but I also realize that it's probably not totally their fault. TSMC has been screwing up one d*mn process node after the other. Hopefully Global Foundries will give us all much cooler (and faster) running chips. :)
Posted on Reply
#20
TheLostSwede
News Editor
wo ai amd = I love AMD...

Now that's hilarious :D
Posted on Reply
#21
Unregistered
1c3d0g: I still feel NVIDIA could do a lot better on their GPU TDP numbers, but I also realize that it's probably not totally their fault. TSMC has been screwing up one d*mn process node after the other. Hopefully Global Foundries will give us all much cooler (and faster) running chips. :)
I don't think so; if that were true, then ATI would be power hungry and less efficient as well.

And NVIDIA's design is screamingly inefficient. Just look at the GTX 465: it has more shaders than the GTX 285, but in the end it still can't beat it and consumes more power.

BTW, I totally hope NVIDIA can bring a dual-GPU solution, because that's what will make ATI drop prices.

And I want a price war again, god dammit. Just look at it right now: 6+ months on and prices are still above MSRP. It's just pathetic and not cool :cry::cry:
Posted on Edit | Reply
#22
btarunr
Editor & Senior Moderator
u2konline: Why 768 MB with 192-bit?

Why not 1 GB w/ 256-bit or 2 GB w/ 256-bit?
There is always scope for partners to give out 1536 MB models.
Posted on Reply
#23
mdsx1950
Imsochobo: Uhm, why would you want that? Go 470/480 if you really want NV.
ATI pushes a 5850 at 150 W, so a 5970 is a way better deal: better efficiency AND more horsepower.

Galaxy will probably launch one if you really, really want one, but in a while.

But it would just hit a 300 W max TDP with dual GPUs at stock speeds, or maybe 280 W.
Why would I do that lol?? I've got 2x 5970s. :laugh:

And yeah, Galaxy's launching a dual-470 card. My cards can beat two GTX 480s, so dual 470s are out of the picture.

I was just wondering when they would release the 4xx-series version of the GTX 295.
btarunr: That would just be on par with GTX 485.
What I actually meant was: when will their ultimate dual-GPU card come out? Not a midrange dual-card combination.
Posted on Reply
#24
Fourstaff
mdsx1950: What I actually meant was: when will their ultimate dual-GPU card come out? Not a midrange dual-card combination.
I highly doubt that it will happen. Their top-of-the-range chips eat 300 W; even with a revision that takes out 100 W, we would still end up with a 400 W dual-chip card.
Posted on Reply
#25
mdsx1950
Fourstaff: I highly doubt that it will happen. Their top-of-the-range chips eat 300 W; even with a revision that takes out 100 W, we would still end up with a 400 W dual-chip card.
Well, with the 6xxx series coming out in H1 2011... NVIDIA will probably have to put out one hell of a card to compete.
Posted on Reply