
NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

I've considered flashing my 260 with a 280 BIOS, but with the missing memory chips and the most likely hardware-disabled GPU sections... even if it did work, you'd just have an overvolted and overclocked GTX 260. But maybe someone out there will try it and find out something more positive? Won't be me! I'll flash my 260, but only with a 260 BIOS! :D
 
Not really surprising, and everyone needs to realize this has been common practice in the computer industry for decades. Processor manufacturers do it, and so do video card manufacturers. ATi and nVidia have been cutting down cores to make lower-end cards for a very long time, so don't get in a big huff about it now.

I hope nVidia doesn't keep the GTX260 name, though; I would prefer GTX270 or GTX265 to keep confusion down.



Yep, exactly. And the G92s that are defective get put in 8800GTs and 8800GSs (9600GSOs). ATi didn't do it with the RV670, but they did with the R600. The 2900GT was just a defective R600 with the defective shaders turned off.

Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. Same thing with the Celerons and Pentium E2000 series: they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled to give the appearance of a single-core processor.
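To make the binning idea everyone is describing concrete, here's a minimal sketch of how dies with defective units might get sorted into SKUs. The function name, cluster counts, and thresholds are illustrative assumptions, not any vendor's actual process:

```python
# Hypothetical sketch of die binning: dies with defective shader
# clusters are sold as the cut-down SKU instead of being discarded.

def bin_die(working_clusters: int, stable_at_full_clock: bool) -> str:
    """Assign a GT200-style die (10 clusters of 24 SPs) to a SKU.

    Thresholds here are illustrative only.
    """
    if working_clusters >= 10 and stable_at_full_clock:
        return "GTX 280"  # all 10 clusters (240 shaders) enabled
    if working_clusters >= 8:
        return "GTX 260"  # 8 clusters (192 shaders) enabled, rest fused off
    return "scrap"        # too many defects to salvage

print(bin_die(10, True))   # -> GTX 280
print(bin_die(10, False))  # -> GTX 260 (fully working die that can't hold clocks)
print(bin_die(9, True))    # -> GTX 260 (one defective cluster disabled)
```

The same sketch applies to the CPU examples above: swap "clusters" for L2 cache halves or whole cores and the sorting logic is identical.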

AMD does this too; some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era.



No, you can't do it yourself; nVidia (and ATi) stopped this long ago by physically breaking the connection on the die itself.


I thought all that was "common" knowledge? People didn't know that? I always just assumed that all the Celerons/Semprons were messed-up Allendales/Athlons with cache/cores disabled.
 
By the time GTX 280 Rev. 2 comes out, it will be too late; the HD 4870 X2 will already be priced lower than the newest revision of the GTX 200 series.
 
What a joke! Is this all nVidia has??? Once the 4850X2 hits the street, it will really be over this round.
 
Gotta love competition. You Green Team members should be very happy that the Red Team is giving them competition.
 

Exactly! Without competition, the 8800GTX would still be their flagship card.
 
Question: Would I be able to SLI a plain GTX 260 with this one (GTX 265?)

Probably not without some sort of modding... and why would you want to? The slowest card determines the speed in SLI, so you would basically be SLI'ing two normal 260s but paying more for the fancier version.

I was gonna buy another 260 this week to SLI, but I figure I'll wait till their price goes through the floor with this announcement. If you OC a 260 to about 725 core and 1450 shaders, it can play on the same level as a stock 280. If you SLI two OC'd 260s, you're probably looking at the performance of a stock GT300 (384 shaders etc.)... that's plenty of juice, even for Crysis at Very High.

I'm actually kind of happy this is coming out... means I get to buy a cheapo 260. :D
 
Does anyone here remember when ATI's X1800GTO could be unlocked to an X1800XL? The X1800GTO had 12 pipes, while the XL had 16. All that was needed was a BIOS update. Some cards had an extra VRM allowing you to do this, while some didn't and were laser-locked. Same situation here, I think...
 
Decreasing the shader count on a lower-end card is nothing new. They never actually produce different chips for the two highest-end cards. If a GTX 200 die is produced and one or two of the clusters are dysfunctional, they can just disable them and use the chip as a GTX 260. The same goes for chips that can't stay stable at 280 speeds, even if all the shaders work.

However, as more 260s sell than 280s (this always happens with cheaper cards), nVidia must use some perfectly functional cores in the 260. This has been going on for generations with both card makers. Before they were laser-locked, these chips could be unlocked and run at full speed. Sometimes people got cores that actually did have dysfunctional parts, and the BIOS change never worked for them; it was chance.
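As a back-of-the-envelope illustration of why good dies end up locked down (all numbers here are made up, not actual yield figures):

```python
# Made-up numbers: if demand for the cheaper SKU exceeds the natural
# defect rate, perfectly functional dies must be down-binned to meet it.

wafer_dies       = 1000
defective_share  = 0.25  # dies with a bad cluster -> only fit for GTX 260
demand_260_share = 0.60  # market wants 60% of sales to be the GTX 260

natural_260 = int(wafer_dies * defective_share)   # 250 dies from defects
needed_260  = int(wafer_dies * demand_260_share)  # 600 dies demanded
down_binned = max(0, needed_260 - natural_260)    # 350 fully working dies locked

print(f"{down_binned} fully functional dies sold as GTX 260")
```

That surplus of fully working, locked dies is exactly what made the old BIOS unlocks a lottery: you either got one of the down-binned good cores or one with real defects.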
 

When nVidia released the 8800GTS 640MB with 112 SPs instead of 96, you could still SLI the two together. I don't know if the one with more SPs made use of the extra shaders in SLI though; I think you're right in saying it wouldn't.

Also, when they did this with the 8800GTS, they released the new card at the same price point as the old 8800GTS and discontinued the previous card, so the prices didn't actually go down for the old cards.
 
No, it doesn't; pairing a 112-SP card with a 96-SP card makes both run as 96-SP cards.
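The rule being described here reduces to taking the minimum of the two specs. A one-line sketch of the principle (not NVIDIA's actual driver logic):

```python
# Sketch: a mismatched SLI pair runs both cards at the lowest common spec.

def effective_sli_shaders(card_a_sp: int, card_b_sp: int) -> int:
    """Both cards in SLI fall back to the smaller shader count."""
    return min(card_a_sp, card_b_sp)

# 112-SP 8800GTS paired with a 96-SP 8800GTS: both act as 96-SP cards.
print(effective_sli_shaders(112, 96))  # -> 96
```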
 
Still far from the 4870. And by the way, guys, does anyone have a link for the 1GB 4870? I need to know its price too.
 