Monday, September 15th 2008

New GeForce GTX 260 Could Lead to Overstock of Older GTX 260

Bad news for manufacturers, possibly good news for consumers. NVIDIA has revised the GeForce GTX 260 graphics processor (GPU), now designated D10U-102, in an attempt to assert performance superiority over the Radeon HD 4870. From a technical standpoint, views on the move are mixed: some suggest the performance lead over the older version isn't significant, while others see it as a good move with potential for further gains through tweaks and overclocking. What cannot be denied is that for the new GTX 260 to fit into the market, some tough math is at play.

The new GPU will certainly affect the prices of products across the segment, including those of the older GTX 260. If the older GTX 260 is priced significantly lower than the newer version, or the newer one significantly higher, it could hurt sales of either product, and manufacturers' profits in general. Concerns that the new GPU could cause over-stocking of products based on the older GPU loom large, reports industry observer DigiTimes. Overstock is a condition where demand for a product is much lower than, or declining relative to, its supply. If the D10U-102 sells much better than the older GTX 260 while the latter remains in good stock, manufacturers could be forced to sell the older cards at lower prices, a common reaction to overstocked inventory. Expect great prices driven not by inter-brand competition, but by intra-brand competition.
Source: DigiTimes

36 Comments on New GeForce GTX 260 Could Lead to Overstock of Older GTX 260

#1
[I.R.A]_FBi
nvidia, make the damn cards smaller so i can fit one!
#2
Unregistered
[I.R.A]_FBi: nvidia, make the damn cards smaller so i can fit one!
agreed
#3
[I.R.A]_FBi
if i saw one for 179 id prolly cop if i could fit it
#4
CrAsHnBuRnXp
If I could get one for about 175 I'd get one.
#5
Darkrealms
EVGA needs to release this model. I need a second GTX260 and if the prices get that low, I'll get a second one!
#6
Octavean
Yeah, if the older GTX 260 had a significant price drop soon I would probably buy one for my upcoming Intel Nehalem X58 build. Otherwise I would just go with a spare 8800 GTS 640MB.
#7
CrAsHnBuRnXp
I don't know if I'd want the one with more shaders or the one with fewer.
#8
candle_86
doesn't make a big difference
#9
Kursah
What I'm interested in more than the extra 24 shaders in comparison to my card is the overclocking abilities of the card, the heat output and if it needs more voltage or not to compensate for the extra shaders.

I won't mind stepping up to the new GTX260 if it proves to have similar OC results...we still have to wait and see if the 180 series of drivers will extract more performance out of these and previous generation cards. I believe they will...just don't know if it'll be very much.

I was hoping this one would also be 55nm, but that's gonna happen after the end of my 90-day step-up program, so I'm not gonna worry too much about it. But these could still be promising in the future; may not seem like much now, but it may make a difference in future games. In the end it's still 24 more shaders, get them up to 1500-1600MHz and they'll do pretty good. Too bad GTX's don't have higher speed shaders yet...I bet it happens with 55nm tho, I'd guess 1700-1800 shader speed tops, maybe after monster OC's.

:toast:
#10
Darkrealms
candle_86: doesn't make a big difference
PFFF... yes it does. The old one will be cheaper now ; )
#11
soldier242
Kursah: What I'm interested in more than the extra 24 shaders in comparison to my card is the overclocking abilities of the card, the heat output and if it needs more voltage or not to compensate for the extra shaders.

I won't mind stepping up to the new GTX260 if it proves to have similar OC results...we still have to wait and see if the 180 series of drivers will extract more performance out of these and previous generation cards. I believe they will...just don't know if it'll be very much.
same thoughts here, i think i'll sell my gtx 260 soon and wait for the nu one
#12
Skywalker12345
yea nvidia has huge cards, so does ati; we need to make smaller, more powerful gpus that don't take 500W each
#13
paybackdaman
This is exactly what happened with the older 8800GTS. It went from 96 shaders to 112. I wonder if NVIDIA learned anything from this? :slap:
#14
..'Ant'..
It better cost less than the older GTX 260.
#15
Tatsumaru
Hey guys, can someone help me out here? I am using a Quad 6600 at 3.2 GHz. If I switch from my 9800GTX to this new GTX260 with more shaders, just how much firepower will I gain at 1680x1050 with all the eye candy settings ON? 30%?? Or maybe 50%??
And by the way, I agree with some of you here, NVIDIA should make those DAMN good videocards SMALLER!!!! I could barely fit a 9800GTX.
#16
Selene
I would think 25-30% boost in most games.
#17
Silverel
They should really change the model name to GTX265 or something. Leaving it as-is will just end up confusing the less informed...

O wait, that seems to be their plan most of the time. It'll give them an excuse to drop prices on the old models, and spout off about how much faster they are than the competition. Hence, the uninformed will run out and buy a GTX260... but wait, it's an old one.

Genius!
#18
Scrizz
who cares if it's old, it's cheaper
#19
xfire
paybackdaman: This is exactly what happened with the older 8800GTS. It went from 96 shaders to 112. I wonder if NVIDIA learned anything from this? :slap:
Nvidia isn't going to learn anything. They don't take the losses here; it's the card manufacturers who stocked up on the older 260 chips.
#20
Silverel
Scrizz: who cares if it's old, it's cheaper
Right, but not faster! Seems they'd be setting people up for disappointment... of sorts. :laugh: Not like having a GTX260 could even BE disappointing considering the overall performance. Just might not be up to expectations.
#21
SK-1
paybackdaman: This is exactly what happened with the older 8800GTS. It went from 96 shaders to 112. I wonder if NVIDIA learned anything from this? :slap:
I know I learned a lesson... an 8800 with 96 shaders that does 700+ core will outdo the 8800 with 112 shaders that I could only get to 660 core :o
#22
TheGuruStud
SK-1: I know I learned a lesson... an 8800 with 96 shaders that does 700+ core will outdo the 8800 with 112 shaders that I could only get to 660 core :o
Must've got a bad g92, I thought 720 or more was the norm.
#23
Zubasa
paybackdaman: This is exactly what happened with the older 8800GTS. It went from 96 shaders to 112. I wonder if NVIDIA learned anything from this? :slap:
Also the fact that nVidia produced what? Four cards named 8800GTS? :banghead:
#24
DaedalusHelios
You are only hurt by the naming scheme if you are an uninformed customer. We aren't uninformed..... so why are we complaining if we understand the difference?

It didn't confuse me at least. :laugh:
#25
Silverel
DaedalusHelios: You are only hurt by the naming scheme if you are an uninformed customer. We aren't uninformed..... so why are we complaining if we understand the difference?

It didn't confuse me at least. :laugh:
I like to advocate for the uninformed. Too many times have I gone into a BB or CC and heard their sales guys pushing crap into the hands of their customers. I take every chance I get to pull them aside before they pay for stuff and let them in on the real deal. There are no complaints from us; we all know the difference. However, there are tons of people who stop by the site asking questions just like this. Gamers that don't follow tech, and just want the best deal. They shouldn't be ripped off over and over again. Changing the name to reflect architectural differences is something that MOST companies do without a second thought. Why nVidia refuses to is just disgusting to me... :shadedshu