Thursday, April 8th 2010

NVIDIA Designs GeForce GTX 460

NVIDIA is said to be working on another GF100-based SKU, one cut down further than the GeForce GTX 470. The new SKU, GeForce GTX 460, is expected to rival the ATI Radeon HD 5850 in the sub-$300 market segment. The GF100 will be configured with a 256-bit wide GDDR5 memory interface driving 1024 MB (1 GB) of memory. Clock speeds could be another differentiating factor NVIDIA uses. The GTX 460 will not have a reference design as such, and AIC partners will be allowed to release their own designs from day one. The new SKU is expected to launch in June. Pictured below is the GTX 470 PCB.
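
As a rough sanity check on what a 256-bit interface means for bandwidth, here is a minimal Python sketch; the GDDR5 data rates are illustrative assumptions on our part, since the report does not give memory clocks:

```python
# Memory bandwidth (GB/s) = bus width in bytes * effective per-pin data rate (Gbps)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

# The rumored GTX 460 has a 256-bit bus; try a few plausible GDDR5 data rates
for rate in (3.2, 3.7, 4.0):
    print(f"256-bit @ {rate} Gbps -> {bandwidth_gbs(256, rate):.1f} GB/s")
    # prints 102.4, 118.4 and 128.0 GB/s respectively
```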

Source: Expreview

58 Comments on NVIDIA Designs GeForce GTX 460

#1
newtekie1
Semi-Retired Folder
HalfAHertz said:
If you can pawn it off to someone looking for SLI for ~$260 ($100 below retail), you can get 10% extra perf. and DX11 for just $50 :p How is that not worth it? Come on, give in to the urge, listen to the little faint voice inside your head. DO IT! UPGRADE!
DX11 at this point is worthless to me; no games actually use it, with the possible exception of Metro 2033, and the games that claim to use it don't look any better than when they are running in DX10.

And you aren't making it any easier...I've had the upgrade itch for months now...I am getting bored with what I have...:laugh: But I've decided I'm skipping this round of cards and waiting for the revamps in 6-12 months...I can hold out that long...I think...:roll:

vagxtr said:
Yep, but the GTX470 will have a "sufficient enough" 320-bit memory bus. With further reduction, all we could see is large drops, like in G92 with a 192-bit bus. Only if they use some HD5770-like GDDR5 at 6 Gbps, and not 4 Gbps like in the GTX480, could we see improvements. But that's very unlikely to happen; maybe they even cut it down to 3200 MHz GDDR5. So the memory, along with only half the ROPs, will be a huge issue here no matter how many shaders are inside. Nice way for envydia to collect money from gullible fans. :nutkick: Just like with the HD5830, as Zubasa mentioned.
At a 256-bit bus, it would match HD5870 memory bandwidth, assuming the memory is running at the same speed, which I bet it will be, considering fewer memory chips generally means faster memory is used; not too bad really. It would also cut it down to 32 ROPs, matching the HD5870 again, also not bad.
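
That bandwidth-parity point is easy to check with simple arithmetic. A hedged Python sketch; the 4.8 Gbps (HD 5870) and ~3.35 Gbps (GTX 470) effective rates are quoted from memory, so treat them as approximate:

```python
def bandwidth_gbs(bus_bits, rate_gbps):
    # bus width in bytes times effective per-pin data rate
    return bus_bits / 8 * rate_gbps

hd5870 = bandwidth_gbs(256, 4.8)    # ~153.6 GB/s
gtx470 = bandwidth_gbs(320, 3.35)   # ~134.0 GB/s, despite the wider bus
gtx460 = bandwidth_gbs(256, 4.8)    # identical to the HD 5870 at the same rate

assert gtx460 == hd5870
print(hd5870, gtx470, gtx460)
```

With equal bus widths, bandwidth depends only on the data rate, which is the whole point being made above.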

I would see it as more of the difference between G80 and G92. G92 was essentially the same architecture, but with a 256-bit memory bus instead of 384/320, and it had 16 ROPs instead of 24, yet G92 was a very successful and powerful GPU.

And your 192-bit argument doesn't work on me, as I actually loved those cards. The 8800GS/9600GSO were some of my favorite cards; I still run one in one of my PCs, actually. They provided amazing performance for their price.
#2
Wile E
Power User
newtekie1 said:
DX11 at this point is worthless to me; no games actually use it, with the possible exception of Metro 2033, and the games that claim to use it don't look any better than when they are running in DX10.

And you aren't making it any easier...I've had the upgrade itch for months now...I am getting bored with what I have...:laugh: But I've decided I'm skipping this round of cards and waiting for the revamps in 6-12 months...I can hold out that long...I think...:roll:



At a 256-bit bus, it would match HD5870 memory bandwidth, assuming the memory is running at the same speed, which I bet it will be, considering fewer memory chips generally means faster memory is used; not too bad really. It would also cut it down to 32 ROPs, matching the HD5870 again, also not bad.

I would see it as more of the difference between G80 and G92. G92 was essentially the same architecture, but with a 256-bit memory bus instead of 384/320, and it had 16 ROPs instead of 24, yet G92 was a very successful and powerful GPU.

And your 192-bit argument doesn't work on me, as I actually loved those cards. The 8800GS/9600GSO were some of my favorite cards; I still run one in one of my PCs, actually. They provided amazing performance for their price.
I feel your pain, newtekie, I feel your pain. lol. I'm itching to upgrade, but I feel this generation of cards has no performance benefits for me, as my performance is still pretty top notch. I would essentially just be upgrading my features.
#3
Grings
I am also holding on for the refreshed cards later this year.

I just hope they're big refreshes like the X1800-X1900/7800GT-7900GT, with extra shaders and whatever else could be planned (considering 5000 MHz GDDR5 is cheap enough to put on 5750s, I'd like to see what bandwidth the 480's 384-bit bus could muster at that speed, especially if they get enough good chips to actually use all 512 shaders).
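
For what it's worth, the bus-width arithmetic on that idea looks like this (the GTX 480's ~3.7 Gbps stock effective memory rate is an approximation on my part; the 5 Gbps figure is the poster's):

```python
def bandwidth_gbs(bus_bits, rate_gbps):
    # bus width in bytes times effective per-pin data rate
    return bus_bits / 8 * rate_gbps

stock   = bandwidth_gbs(384, 3.7)  # roughly 177.6 GB/s at the GTX 480's stock memory speed
boosted = bandwidth_gbs(384, 5.0)  # 240.0 GB/s with the 5 Gbps chips mentioned above
print(stock, boosted)
```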

I'm kinda hoping that this happens at the end of the year though; I still need to buy an i7 board, chip, and DDR3 yet. I'm not sure whether to wait for more SATA 3/USB 3 boards to come out, or get a Gigabyte UD5 or Asus P6X58D Premium. I see the Rampage III is available for preorder in a few places, but it's silly money.

*fakeedit* ooh look, tatty has a Walking Ban stick, good choice tpu overlords, grats!
#4
Wile E
Power User
Grings said:
I am also holding on for the refreshed cards later this year.

I just hope they're big refreshes like the X1800-X1900/7800GT-7900GT, with extra shaders and whatever else could be planned (considering 5000 MHz GDDR5 is cheap enough to put on 5750s, I'd like to see what bandwidth the 480's 384-bit bus could muster at that speed, especially if they get enough good chips to actually use all 512 shaders).

I'm kinda hoping that this happens at the end of the year though; I still need to buy an i7 board, chip, and DDR3 yet. I'm not sure whether to wait for more SATA 3/USB 3 boards to come out, or get a Gigabyte UD5 or Asus P6X58D Premium. I see the Rampage III is available for preorder in a few places, but it's silly money.

*fakeedit* ooh look, tatty has a Walking Ban stick, good choice tpu overlords, grats!
The SATA 3/USB 3 boards are out. And Gigabyte is killing it in the X58 market. Any GB board is good.

And yeah, I'm waiting for the die shrinks and additional shaders. I want to see what a die shrunk 512 shader Fermi can do.

Or even what ATI has up its sleeve. Even more shaders on a die shrink, perhaps?
#5
a_ump
Well shit, I hope NVIDIA actually scores with this card. I mean, they need to put some pressure on ATI somewhere. I want some hardcore leaked info on whatever ATI's going to release next, lol, as well as lower prices.
#6
TheMailMan78
Big Member
Wile E said:
I feel your pain, newtekie, I feel your pain. lol. I'm itching to upgrade, but I feel this generation of cards has no performance benefits for me, as my performance is still pretty top notch. I would essentially just be upgrading my features.
Just wait for the next generation, guys. I have a feeling Nvidia is going to bring the pain. I really think this was Nvidia's 2900, but I think they will skip the 3870 stage and go right for the 4850 stage! :rockout:

If they do, I'll go Nvidia next upgrade. I swear it.
#7
newtekie1
Semi-Retired Folder
TheMailMan78 said:
Just wait for the next generation, guys. I have a feeling Nvidia is going to bring the pain. I really think this was Nvidia's 2900, but I think they will skip the 3870 stage and go right for the 4850 stage! :rockout:

If they do, I'll go Nvidia next upgrade. I swear it.
It is just another G80; in fact, it is very similar. Both had wider than normal memory buses for their time, both were huge dies for their time, both ran very hot, and both sucked up huge amounts of power for their time.

I mean, everyone freaked out at the 8800GTX requiring TWO 6-pin power connectors. And they ran at close to 90°C too. And yet, G80 was praised for being an amazing GPU...

The only difference is that this time around ATi's GPU isn't worse, its better.

And as for relevance to this thread, I'd like to point out that the most successful G80 card was the cut-down 8800GTS...I wonder if the most successful Fermi card will be the cut-down GTX460.

I expect the next revision to be similar to the G80 -> G92 revision, similar but more modest specs with much better power and heat characteristics and much better clock speeds.
#8
a_ump
newtekie1 said:
It is just another G80; in fact, it is very similar. Both had wider than normal memory buses for their time, both were huge dies for their time, both ran very hot, and both sucked up huge amounts of power for their time.

I mean, everyone freaked out at the 8800GTX requiring TWO 6-pin power connectors. And they ran at close to 90°C too. And yet, G80 was praised for being an amazing GPU...

The only difference is that this time around ATi's GPU isn't worse, its better.

And as for relevance to this thread, I'd like to point out that the most successful G80 card was the cut-down 8800GTS...I wonder if the most successful Fermi card will be the cut-down GTX460.

I expect the next revision to be similar to the G80 -> G92 revision, similar but more modest specs with much better power and heat characteristics and much better clock speeds.
I hope it's a bigger jump than G80 to G92. G92 was a better and more refined core overall, but the 9800GTX's performance wasn't better than the 8800GTX's at higher resolutions, sometimes even at 1680x1050. I'd like for the revision to reduce heat and power but have a performance jump like the HD 4890 had over the HD 4870; that would be better marketing-wise.