Originally Posted by phanbuey
Makes you wonder why they sat around on their butts during the g80/g92 reign... This could have been out a long time ago.
If by that you mean why they didn't release anything new, it's because they really didn't have many options. Without serious competition for their top cards, and with G92 cards being capable of playing everything at the highest settings, any new card would just have been seen as overkill. Without Ati cards being on par, developers would never really take advantage of the extra power of the new cards. It happened to G80 to some extent, and back then extra performance was more needed than in the G92 days.
Later, many factors led to GT200's "failure" (I don't think it's a failure at all, as long as you see it as something more than a GPU).
First of all, Ati played really well with RV770, including not revealing the true specs until very late (many partners still listed 480SP on their sites even after the card was released, until the NDA was lifted!), thus negating any effective response from Nvidia. When you are in the lead to the extent Nvidia was back then, you have to try to match the competitor's performance as closely as possible. Being much faster won't help you at all, because of what I said above about developers: you would increase costs with little to no perceived advantage for the masses.
Not using GDDR5 was not really an error IMO, and skipping it was not because they were lazy, sitting on their butts. GDDR5 has been expensive and scarce, and manufacturers have been struggling to meet the demand. Nvidia was selling twice as much as Ati when GT200 was conceived and also when it was released. That means that if they wanted to maintain that rate (and they had to...), the number of graphics cards shipped with GDDR5 would have been 3x the amount of the ones (HD4870 and X2) that have actually shipped (even today Nvidia leads at 30% while Ati is at 20%, so take that into account too). Manufacturers simply wouldn't have been able to meet that volume, and prices would have been much, much higher, to the point of reaching obscene numbers. Needless to say, that would have been bad for everybody except the memory manufacturers, and especially for the end user.
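To make the supply math concrete, here's a back-of-the-envelope sketch (my own illustration, nothing official; the unit counts are normalized and only the ~2:1 sales ratio comes from the argument above):

#include <stdio.h>

int main(void) {
    /* Normalize Ati's actual GDDR5 shipments (HD4870 + X2) to 1 unit. */
    double ati_gddr5 = 1.0;
    /* Nvidia was selling roughly twice Ati's volume at the time. */
    double nvidia_gddr5 = 2.0 * ati_gddr5;
    /* Had GT200 also used GDDR5, total demand = Ati's + Nvidia's = 3x reality. */
    double total = ati_gddr5 + nvidia_gddr5;
    printf("GDDR5 demand vs. what actually shipped: %.0fx\n", total / ati_gddr5);
    return 0;
}

That 3x is against a memory supply that was already strained just covering Ati's share.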
Additionally, a 512-bit memory bus is very beneficial (almost a must, IMO) for CUDA, and Nvidia decided to bet hard on their GPGPU solution. And although CUDA is still not widely used, and although CUDA support is one of the things that made GT200 so big and expensive, I think it was justified. CUDA simply works; I love it if only for the encoding capabilities. Badaboom is far from being perfected and it's already like a godsend for me: I usually encode 20+ videos per day (mp4, yeah I was lucky there) and that usually took me 2-3 hours on my previous CPU (X2 4800+) and 1-2 with the Quad. With Badaboom and my 8800GT I can do it in 20-30 minutes, so it's simply amazing. I can only see it getting much better in the future with GT200 and above cards (specifically designed for CUDA).
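To illustrate why a wide memory bus matters so much for CUDA, here's a minimal kernel sketch (my own toy example, not Badaboom's code). SAXPY does almost no arithmetic per element, so its throughput is bounded by memory bandwidth, which is exactly where a 512-bit bus pays off:

#include <cstdio>
#include <cuda_runtime.h>

// Each element does 2 loads, 1 store and only 2 flops, so this kernel is
// limited by memory bandwidth rather than by compute.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    cudaMemset(x, 0, n * sizeof(float));   // placeholder data for the sketch
    cudaMemset(y, 0, n * sizeof(float));
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
    return 0;
}

Video encoding is the same story on a much bigger scale: lots of pixels streamed through relatively simple per-element work, so the GPU's raw bandwidth and parallelism beat the CPU easily.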
The only thing Nvidia should have changed in the first place is the manufacturing process; that was what made GT200 so expensive. Anyway, I don't think that was due to the process itself (it wasn't inherent to it), but just a flawed implementation at launch. IMO the yield problems are more than fixed right now, and there's more than competitive pressure behind the massive price cuts. What I mean is that there's a strong "we can" component along with the "we need to" in the formula for the price cuts.
All in all, IMHO Nvidia has been doing a lot of good things in the meanwhile, rather than sitting on their butts. They have realized that graphics isn't everything and want to follow that route. It just happens that aiming at more than graphics makes you weaker in graphics-only applications from a performance/price point of view, but it's a step you have to take if you really want to venture into new waters. Some people might appreciate it and some won't, but IMO there's no doubt about the value of CUDA and PhysX.
Originally Posted by newconroer
Maybe because staying ahead of the competition isn't worth the costs or the resources when there's no real demand in the market for them to do so.
With the exception of Crysis, no game required the caliber of the R700 or the GT200 until a few recent titles.
So what would be the point?
Exactly. You beat me to it, although I wanted to elaborate my reply much more. And yeah, I know I'm boring to read.