Thursday, August 21st 2008

NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

In a move that can be seen as retaliation against the HD 4870 variants that come with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter the upcoming Radeon HD 4850 X2, NVIDIA has decided to give the GeForce GTX 260 an upgrade by enabling an additional Texture Processing Cluster (TPC) in the card's G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs (8 × 24 = 192 shader processors); the updated core will have 9 TPCs, which amounts to an additional 24 shader processors and should increase the core's shader compute power significantly more than merely raising frequencies would. It is unclear at this point what the resulting product will be called.
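
For reference, the shader-count arithmetic works out as in the minimal Python sketch below (the 3-SMs-per-TPC breakdown is standard G200 architecture rather than something stated above, and the function name is ours):

    # Each TPC on the G200 holds 3 Streaming Multiprocessors (SMs),
    # and each SM holds 8 shader processors (SPs): 3 * 8 = 24 SPs per TPC.
    SMS_PER_TPC = 3
    SPS_PER_SM = 8

    def shader_count(tpcs: int) -> int:
        """Total shader processors for a G200 part with `tpcs` TPCs enabled."""
        return tpcs * SMS_PER_TPC * SPS_PER_SM

    print(shader_count(8))   # original GTX 260: 192 SPs
    print(shader_count(9))   # updated GTX 260:  216 SPs
    print(shader_count(10))  # full G200 (GTX 280): 240 SPs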

Everything else, including frequencies, memory size, and memory bus width, remains the same. This upgrade could take shape by this September.

Source: Expreview

86 Comments on NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

#1
jbunch07
So were the shaders already there, just not activated?
#3
EastCoasthandle
People who already own a 4800 series or 200 series card can pretty much max out all settings with 4xAA/16xAF and, in most cases I believe, stay over 50 FPS, though it depends on their native resolution. So IMO I don't see this being a popular card. Besides, I couldn't imagine those who just purchased a 4800 series or 200 series card buying this. And I would be curious to know if people with 260s could actually step up; if they can, I would imagine the bulk of sales would come from that.
#5
EastCoasthandle
by: jbunch07
I see that the GTX280 and GTX260 have the same transistor count.
Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?
#6
jbunch07
by: EastCoasthandle
Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?
That's what it looks like to me.
#7
kyle2020
by: jbunch07
That's what it looks like to me.
That'll be a knife in Nvidia's back if too many people notice that.
#8
EastCoasthandle
by: jbunch07
That's what it looks like to me.
This will be very interesting if it turns out to be true. I can only guess that some 260 owners wouldn't like this (if they couldn't step up).
#9
candle_86
That's what Nvidia always does, people. How do you think the 112-shader 8800GTS 640 was made?
#10
Sasqui
by: kyle2020
That'll be a knife in Nvidia's back if too many people notice that.
I'd be pissed if I had paid for a 260 already!
#11
jbunch07
by: EastCoasthandle
This will be very interesting if it turns out to be true. I can only guess that some 260 owners wouldn't like this (if they couldn't step up).
Well, I don't think it would be the first time something like this has happened.
#12
Darkrealms
Go figure, I just ordered a 260 yesterday. Oh well.
Thanks for the info, BTA.
#13
Kursah
by: jbunch07
That's what it looks like to me.
Yeah, that's what I was wondering too...but how they've disabled the shaders/cores is what I've been curious about...there's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!

Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo...but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!

:toast:
#14
kyle2020
by: Sasqui
I'd be pissed if I had paid for a 260 already!
So would I. Seems like companies enjoy doing this sort of thing, and the faithful buyers will always buy from them. Heads out of the sand, people! Brand loyalty is out the window! :D
#15
EastCoasthandle
by: jbunch07
Well, I don't think it would be the first time something like this has happened.
True, but during the G80 era there was no real competition, so it flew under the radar as an acceptable practice.


by: kyle2020
So would I. Seems like companies enjoy doing this sort of thing, and the faithful buyers will always buy from them. Heads out of the sand, people! Brand loyalty is out the window! :D
Well, that's the whole point of being indoctrinated...I think...
#16
candle_86
You buy on release, you get burned. How do you think 8800GTS users felt when the 8800GTS with 112 shaders instead of 96 popped up?
#17
jbunch07
by: Kursah
Yeah, that's what I was wondering too...but how they've disabled the shaders/cores is what I've been curious about...there's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!

Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo...but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!

:toast:
I just wanna know what's involved in enabling those extra shaders, and if someone could do it themselves?
#18
Kursah
by: jbunch07
I just wanna know what's involved in enabling those extra shaders, and if someone could do it themselves?
We'll find out when they show up! If it's a simple BIOS tweak or a change in the GTX260 fab process, or what the deal is...too bad it's not just a driver tweak! I suppose it could be...but I doubt it.

I wonder if they'll run the GTX260s at 1.18v to maintain stability with more shaders, or if they can keep them at 1.12v. My GTX runs nice and cool overall...even OC'd I hit 65C load at 1.12v; I think the highest I hit at 1.18v was around 73C.
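
(As a rough aside on why that voltage difference matters for heat: dynamic power in CMOS logic scales approximately with voltage squared, P ≈ C·V²·f, so the bump from 1.12v to 1.18v alone would add about 11% dynamic power at fixed clocks. A minimal sketch, using only the voltages mentioned above:)

    # Rough CMOS dynamic-power scaling: P ~ C * V^2 * f.
    # With capacitance and clocks held constant, the ratio is (v_new / v_old)^2.
    def relative_power(v_old: float, v_new: float) -> float:
        """Approximate dynamic-power factor from a core-voltage bump."""
        return (v_new / v_old) ** 2

    print(f"{relative_power(1.12, 1.18):.3f}")  # ~1.110, i.e. about 11% more power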
#19
jbunch07
by: Kursah
We'll find out when they show up! If it's a simple BIOS tweak or a change in the GTX260 fab process, or what the deal is...too bad it's not just a driver tweak! I suppose it could be...but I doubt it.

I wonder if they'll run the GTX260s at 1.18v to maintain stability with more shaders, or if they can keep them at 1.12v. My GTX runs nice and cool overall...even OC'd I hit 65C load at 1.12v; I think the highest I hit at 1.18v was around 73C.
Hmm, I guess we'll just have to wait and see what all is involved.
If someone on here knows the answer, please stand up. :rockout:
#20
candle_86
Prolly more complex; they were most likely laser cut.
#21
wolf2009
by: EastCoasthandle
Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?
I'm surprised how many people don't know this, but I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and GTX260. The defective shaders, or something like that, are "disabled". The perfect chips go into the GTX280 and 8800GTX. This saves Nvidia money.
#22
newtekie1
Semi-Retired Folder
Not really surprising; everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.

by: wolf2009
I'm surprised how many people don't know this, but I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and GTX260. The defective shaders, or something like that, are "disabled". The perfect chips go into the GTX280 and 8800GTX. This saves Nvidia money.
Yep, exactly. And the G92s that are defective get put in 8800GTs and 8800GSs (9600GSOs). Ati didn't do it with the RV670, but they did with the R600: the 2900GT was just a defective R600 with the defective shaders turned off.

Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. Same thing with the Celerons and Pentium E2000 series: they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled to give the appearance of a single-core processor.

AMD does this too; some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era.

by: jbunch07
I just wanna know what's involved in enabling those extra shaders, and if someone could do it themselves?
No, you can't do it yourself; nVidia (and ATi) long ago stopped allowing this by physically breaking the connection on the die itself.
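
(To make the harvesting idea above concrete, here is a minimal sketch; the tier thresholds mirror the G200 parts discussed in this thread, and the function itself is hypothetical, purely for illustration:)

    # Hypothetical die-harvesting/binning for a 10-TPC G200 die: dies whose
    # defective or underperforming TPCs are fused off become lower-tier parts.
    def bin_g200(usable_tpcs: int) -> str:
        """Map a die's count of usable TPCs to a hypothetical product tier."""
        if usable_tpcs >= 10:
            return "GTX 280 (10 TPCs, 240 SPs)"
        if usable_tpcs == 9:
            return "updated GTX 260 (9 TPCs, 216 SPs)"
        if usable_tpcs == 8:
            return "original GTX 260 (8 TPCs, 192 SPs)"
        return "rejected or shelved for a future cut-down part"

    for usable in (10, 9, 8, 7):
        print(usable, "->", bin_g200(usable))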
#23
jbunch07
by: candle_86
Prolly more complex; they were most likely laser cut.
I got some lasers! ;)
#24
btarunr
Editor & Senior Moderator
not "defective", just the ones that happen to perform lower when binning compared what's required to make it to a GTX 280.
#25
jbunch07
by: newtekie1
Not really surprising; everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.



No, you can't do it yourself; nVidia (and ATi) long ago stopped allowing this by physically breaking the connection on the die itself.
That's what I figured.

Oh, and I was well aware that CPU manufacturers have been doing it for quite some time now; I guess I just didn't really think of them doing it quite as much with video cards. But it seems I was wrong.