Saturday, June 28th 2008

NVIDIA in a Rush for 55nm Parts, has Partners 'Red-Eyed'

With the cost of manufacturing a single G200 die reaching up to US $110, thanks to yields as low as 40 per cent, NVIDIA seems to be in a rush for a 55nm revamp of its current GPUs. While nothing revolutionary is on the cards, and with the 55nm G92b already in the making, NVIDIA plans to move its G200 graphics processors to the 55nm fab process, increasing yields to as much as 50 per cent. At 55nm, the G200 die will shrink to an effective 470 sq. mm, which implies roughly 120 dice per 300mm wafer.
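The roughly-120-dice figure can be sanity-checked with a back-of-the-envelope calculation. The sketch below uses the common gross die-per-wafer approximation (wafer area over die area, minus an edge-loss term for partial dice at the rim); the formula itself is a standard industry rule of thumb, not something stated in the article.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross die count: wafer area divided by die area, minus a standard
    edge-loss correction for partial dice along the wafer's rim."""
    wafer_area = math.pi * (wafer_diameter_mm / 2.0) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# 55nm G200 at 470 sq. mm on a 300mm wafer
print(dies_per_wafer(470))  # -> 119, in line with the ~120 figure above
```

Note that this is a gross count: at the quoted 50 per cent yield, only about half of those dice would end up as sellable chips.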

The pace at which things are moving has partners red-eyed. NVIDIA's new Unilateral Minimum Advertised Price Policy (UMAP) limits partners' playing field and minimizes competition between them. When NVIDIA at the same time decides to launch new cards based on existing cores at lower prices, partners get upset over diminishing earnings. Add to that the fact that AMD's new RV770 chip is looking very tempting to some of these partners.

Source: NordicHardware

33 Comments on NVIDIA in a Rush for 55nm Parts, has Partners 'Red-Eyed'

#1
yogurt_21
candle_86 said:
the point is the largest segment of graphics card sales is under 100 bucks, you're forgetting that bta.
where the 3850 is likely to dominate. It already tops wiz's price/perf charts, though to be fair ATI is about to launch lower 4000-series cards, which will change that bracket, and their specs don't look anywhere close to those of the 3850.
Posted on Reply
#2
candle_86
the 3850 loses to the 9600GSO, don't forget that. Also, for gaming on a super tight budget, an 8500GT or 8600GT is a nice card too. I can't see Nvidia being too hurt; even in the FX days they had more market share simply because of marketing, and feature-wise the FX5200 looked better than the 9200, even if it was slower. Nvidia's marketing and the price/performance of the lower-end G92/G94/G84/G86 will keep Nvidia in the game even if GT200 doesn't do well.
Posted on Reply
#3
btarunr
Editor & Senior Moderator
Yeah, but the revised pricing throws HD3850 back into the $80 ~ $110 range.
Posted on Reply
#5
newconroer
Megasty said:
It's terrible as it is, but NV has to find out that they can't continue to sell their junk for so much. When the G200b GTX280 or whatever comes out, it'll probably cost just as much as the 65nm GTX280. The performance will be improved, but it will still cost too much. Mix that in with UMAP & I can see why distros have a problem with them. When it comes down to it, do ppl really buy overpriced, undervalued cards, especially when there's something out there that costs $100 less & beats it? An e-penis can only get so big.
I guess you still either don't get it, or won't admit it, that the 280 offers something that no other card on the market can (which will probably include the 4870X2), and that is virtually no texture loading stuttering, and consistently solid minimum frame rates.

It's the first GPU to actually make proper 100% real-time use of its 1GB of texture RAM.

For too long, people have had components in their systems that are top of the line, high end, and in some cases overkill for the applications they are attempting to run. Yet the GPU is the thing that holds them back, with its hitching, bottoming-out RAMDAC, and instabilities.

The 4870 might be on par with top-end potential frame rates, but 80 fps is not what I need to 'get my game on.' I don't need 60, 50 or even 40 frames for that matter. What I need, what we all need, is a card that lets us stay as consistent as possible. To take whatever the game throws at it, chew it up, and spit it back out while grinning the whole time.


Nvidia knows that consumers in 'the know,' will pay the premium price just for this aspect alone.

The problem is that the general consumer isn't aware of these issues; or rather they notice them, yet cannot define or comprehend them - thus not being able to realise that the GTX 280 resolves said issues.

THAT part of it will hurt Nvidia; then again, with the amount of money and resources they have, they probably don't care.

They'll drag out the price on the 280 as long as they can, and for good reason.

Which leads us back to square one: people on a budget need to stop bitching that they can't have high-end products at budget prices.

On a lighter note, consider that there are plenty of cards already on the market (pre 4800/200) that can handle a lot of 3D applications without a problem; and now, they're becoming cheaper.
If anything, rejoice that your budget can accommodate a nice piece of hardware, and let those people who are fortunate enough to have lots of money enjoy the 'best of the best,'
rather than crying and bitching about it.

Life.... deal with it.
Posted on Reply
#6
Airbrushkid
I'm not poor so I'll stay with Nvidia. :toast:

lepra24 said:
I'll always stay with best prices:nutkick:
Posted on Reply
#7
imperialreign
newconroer said:
I guess you still either don't get it, or won't admit it, that the 280 offers something that no other card on the market can (which will probably include the 4870X2), and that is virtually no texture loading stuttering, and consistently solid minimum frame rates.

It's the first GPU to actually make proper 100% real-time use of its 1GB of texture RAM.

For too long, people have had components in their systems that are top of the line, high end, and in some cases overkill for the applications they are attempting to run. Yet the GPU is the thing that holds them back, with its hitching, bottoming-out RAMDAC, and instabilities.

The 4870 might be on par with top-end potential frame rates, but 80 fps is not what I need to 'get my game on.' I don't need 60, 50 or even 40 frames for that matter. What I need, what we all need, is a card that lets us stay as consistent as possible. To take whatever the game throws at it, chew it up, and spit it back out while grinning the whole time.


Nvidia knows that consumers in 'the know,' will pay the premium price just for this aspect alone.

The problem is that the general consumer isn't aware of these issues; or rather they notice them, yet cannot define or comprehend them - thus not being able to realise that the GTX 280 resolves said issues.

THAT part of it will hurt Nvidia; then again, with the amount of money and resources they have, they probably don't care.

They'll drag out the price on the 280 as long as they can, and for good reason.

Which leads us back to square one: people on a budget need to stop bitching that they can't have high-end products at budget prices.

On a lighter note, consider that there are plenty of cards already on the market (pre 4800/200) that can handle a lot of 3D applications without a problem; and now, they're becoming cheaper.
If anything, rejoice that your budget can accommodate a nice piece of hardware, and let those people who are fortunate enough to have lots of money enjoy the 'best of the best,'
rather than crying and bitching about it.

Life.... deal with it.
That's all fine and good, and I understand your point about the minimum FPS rates . . .

but has anyone really stopped to think how much the system itself adds to this issue, more so than the VGA adapter itself?

Sure, it might take a 3870 1GB a little longer to load textures into VRAM, and sitting on a PCIe 2.0 bus decreases load times as well; but if you're trying to load up textures with a P4 sitting in the CPU throne, that in itself will drastically lengthen how long it takes for the VGA adapter to load everything up. You're dealing with a massively bottlenecked system, a bottlenecked bus, pathetic L1/L2 caches . . . all of that can seriously anchor minimum FPS through extended load times.

Just in comparison, with my system I experience very little load-up stuttering, and even then the amount of time is minimal (except in the case of Crysis). Does it detract at all from the gaming experience? Not in the least, seeing as how it's only an issue once a game level is loaded up . . . once the game itself is off and running, I don't notice any other issues until you hit that one spot where it needs to swap textures in/out, and unless that 0.5s pause is going to get one's panties in a bunch, I find it nothing to complain about at all.
Posted on Reply
#8
Nick89
Airbrushkid said:
I'm not poor so I'll stay with Nvidia. :toast:
How about you G*F*Y*S*?


Sorry mods
Posted on Reply