
NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

Discussion in 'News' started by btarunr, Aug 21, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,553 (11.24/day)
    Thanks Received:
    13,644
    Location:
    Hyderabad, India
In a move that can be seen as retaliation against the HD 4870 variants that ship with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter the upcoming Radeon HD 4850 X2, NVIDIA has decided to give the GeForce GTX 260 an upgrade, enabling an additional Texture Processing Cluster (TPC) in the GTX 260's G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs enabled (8 × 24 = 192 shader processors); the updated core will have 9 TPCs, adding another 24 shader processors for 216 in total, which should increase the core's shader compute power more significantly than merely raising frequencies. It is unclear at this point what the resulting product will be called.

Everything else remains the same: frequencies, memory size, and memory bus width. This upgrade could take shape by September.

    Source: Expreview
     
    Last edited: Aug 21, 2008
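The shader math in the article above can be sketched in a few lines; this is purely illustrative, assuming 24 stream processors per TPC as stated in the post (the 10-TPC figure for the full G200/GTX 280 follows from its 240 SPs):

```python
# Stream processors per Texture Processing Cluster on G200,
# per the figures in the article above.
SPS_PER_TPC = 24

def shader_count(tpcs):
    """Total stream processors for a given number of enabled TPCs."""
    return tpcs * SPS_PER_TPC

print(shader_count(8))   # original GTX 260: 192 SPs
print(shader_count(9))   # updated GTX 260: 216 SPs
print(shader_count(10))  # full G200 (GTX 280): 240 SPs
```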
  2. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
So were the shaders already there, just not activated?
     
  3. btarunr

    btarunr Editor & Senior Moderator Staff Member

  4. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (2.00/day)
    Thanks Received:
    1,505
People who already own a 4800-series or 200-series card can pretty much max out all settings with 4xAA/16xAF and, in most cases I believe, stay over 50 FPS, though it depends on their native resolution. So IMO I don't see this being a popular card. Besides, I can't imagine those who just purchased a 4800-series or 200-series card buying this. I'd also be curious to know whether people with 260s could actually step up; if they can, I'd imagine the bulk of sales would come from that, IMO.
     
    Last edited: Aug 21, 2008
  5. jbunch07

    jbunch07 New Member

I see that the GTX 280 and GTX 260 have the same transistor count.
     
  6. EastCoasthandle

    EastCoasthandle New Member

Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some of those features, claiming they added 24 shaders to the other 260?
     
  7. jbunch07

    jbunch07 New Member

That's what it looks like to me.
     
  8. kyle2020 Guest

That'll be a knife in NVIDIA's back if too many people notice it.
     
  9. EastCoasthandle

    EastCoasthandle New Member

This will be very interesting if it turns out to be true. I can only guess that some 260 owners wouldn't like this (if they couldn't step up).
     
  10. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.39/day)
    Thanks Received:
    233
That's what NVIDIA always does, people. How do you think the 8800 GTS 640 with 112 shaders was made?
     
  11. Sasqui

    Sasqui

    Joined:
    Dec 6, 2005
    Messages:
    7,644 (2.38/day)
    Thanks Received:
    1,402
    Location:
    Manchester, NH
    I'd be pissed if I had paid for a 260 already!
     
  12. jbunch07

    jbunch07 New Member

Well, I don't think it would be the first time something like this has happened.
     
  13. Darkrealms

    Joined:
    Feb 26, 2007
    Messages:
    851 (0.31/day)
    Thanks Received:
    23
    Location:
    USA
    Go figure I just ordered a 260 yesterday. Oh well.
    Thanks for the info BTA.
     
  14. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,774 (2.68/day)
    Thanks Received:
    1,659
    Location:
    Missoula, MT, USA
Yeah, that's what I was wondering too... but how they've disabled the shaders/cores is what I've been curious about. There's no way to get 1 GB of memory, though, since the extra chip(s) are missing, but if a GTX 260 could get the same amount of shaders as a GTX 280, I wouldn't complain!

Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, IMO. But either way, some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple of step-up options from EVGA coming soon, then!

    :toast:
     
  15. kyle2020 Guest

So would I. Seems like companies enjoy doing this sort of thing, and the faithful buyers will always buy from them. Heads out of the sand, people! Brand loyalty is out the window! :D
     
  16. EastCoasthandle

    EastCoasthandle New Member

True, but during the G80 era there was no real competition, so it flew under the radar as an acceptable practice.


Well, that's the whole point of being indoctrinated... I think...
     
  17. candle_86 New Member

You buy on release, you get burned. How do you think 8800 GTS users felt when the 8800 GTS with 112 shaders instead of 96 popped up?
     
  18. jbunch07

    jbunch07 New Member

I just wanna know what's involved in enabling those extra shaders, and whether someone could do it themselves?
     
  19. Kursah

    Kursah

We'll find out when they show up! Whether it's a simple BIOS tweak or a change in the GTX 260 fab process, or whatever the deal is... too bad it's not just a driver tweak! I suppose it could be... but doubtful.

I wonder if they'll run the GTX 260s at 1.18 V to maintain stability with more shaders, or if they can keep them at 1.12 V. My GTX runs nice and cool overall; even OC'd I hit 65°C load at 1.12 V, and I think the highest I hit at 1.18 V was around 73°C.
     
  20. jbunch07

    jbunch07 New Member

Hmm, I guess we'll just have to wait and see what's involved.
Unless someone on here knows the answer; if so, please stand up. :rockout:
     
  21. candle_86 New Member

Probably more complex; they were most likely laser-cut.
     
    1c3d0g says thanks.
  22. wolf2009 Guest

I'm surprised how many people don't know this, but I read somewhere that the chips NVIDIA manufactures that have defects get put into cards like the G80 8800 GTS and the GTX 260: the defective shader clusters (or something like that) are "disabled". The fully working chips go into the GTX 280 and 8800 GTX. This saves NVIDIA money.
     
  23. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,935 (6.18/day)
    Thanks Received:
    6,027
Not really surprising, and everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATI and nVidia have been cutting down cores to make lower-end cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.

Yep, exactly. And the G92's that are defective get put in 8800GT's and 8800GS's (9600GSO's). ATI didn't do it with the RV670, but they did with the R600: the 2900GT was just a defective R600 with the defective shaders turned off.

Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. Same thing with the Celerons and Pentium E2000 series: they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled to give the appearance of a single-core processor.

AMD does this too; some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era.

No, you can't do it yourself; nVidia (and ATI) stopped that long ago by physically breaking the connection on the die itself.
     
    Last edited: Aug 21, 2008
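The binning practice described in the last few posts can be sketched as a toy sort: dies with defective clusters get those clusters disabled and ship as a cut-down SKU. This is a hypothetical illustration only; the shader counts come from the thread, but the idea that exactly one or two bad TPCs map to these bins is an assumption for the example:

```python
# Hypothetical binning sketch: a full G200 die has 10 TPCs (240 SPs).
# Dies with defective TPCs have them disabled and are sold as a
# lower SKU. The bin thresholds here are illustrative assumptions.
TOTAL_TPCS = 10
SPS_PER_TPC = 24

def bin_die(defective_tpcs):
    """Assign a die to a SKU based on how many TPCs failed testing."""
    good = TOTAL_TPCS - defective_tpcs
    if good == 10:
        return "GTX 280 (240 SPs)"
    if good == 9:
        return "updated GTX 260 (216 SPs)"
    if good == 8:
        return "GTX 260 (192 SPs)"
    return "scrap"  # too many bad clusters for any SKU in this sketch

for bad in range(4):
    print(bad, "->", bin_die(bad))
```

The same mechanism explains the CPU examples in the post above: a part with a defective cache block or core is sold with that block fused off rather than discarded.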
  24. jbunch07

    jbunch07 New Member

    i got some lasers! ;)
     
  25. btarunr

    btarunr Editor & Senior Moderator Staff Member

Not "defective", just the ones that happen to perform lower during binning compared to what's required to make it into a GTX 280.
     
