Monday, May 19th 2008

NVIDIA Increases Foundry Outsourcing to TSMC and UMC

According to the Taiwanese Economic News site, NVIDIA is planning to increase production at Taiwan Semiconductor Manufacturing (TSMC) and United Microelectronics (UMC) in order to keep up with the strong demand for its graphics chips. Over the last quarter, NVIDIA contracted TSMC to make a record 50,000 wafers of 65 nm chips and UMC to make 7,000-9,000 wafers of the same chips. Industry watchers now forecast TSMC will produce up to 60,000 wafers for NVIDIA, while UMC's volume production will rise to 10,000-12,000 wafers over the next quarter. With demand for NVIDIA cards still growing, the company's executives expect to beat the record revenue of $935.3 million achieved in the same period last year.

Source: CENS.com
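As a rough sanity check on those volumes (purely illustrative: the 300 mm wafer size, the ~300 mm² die area, and the gross-die formula below are assumptions, not figures from the article), the forecast wafer counts translate into chip counts roughly like so:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-die estimate: wafer area divided by die area,
    minus a correction term for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Assumed values: 300 mm wafers, a hypothetical ~300 mm^2 GPU die.
dies = dies_per_wafer(300, 300)
print(dies)            # gross dies per wafer, before yield losses
print(dies * 60_000)   # chips implied by the forecast 60,000 TSMC wafers
```

Real output would be lower once yield is factored in, but it gives a feel for why 60,000 wafers per quarter is a serious volume commitment.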

32 Comments on NVIDIA Increases Foundry Outsourcing to TSMC and UMC

#1
btarunr
Editor & Senior Moderator
Chip wars don't affect TSMC, even ATI gets its wafers made there.

Go NV.
Posted on Reply
#2
FreedomEclipse
Crazy Dogmatic Bullsh!t!
Wow, TSMC are getting loads of work - I didn't know about ATi, but I was aware of an earlier post stating AMD outsourced to TSMC also...

Let's hope the build quality doesn't go below standards. I hate buying sh!t that came from 'third party' groups that don't live up to the original's standards.
Posted on Reply
#3
candle_86
Actually, if Nvidia is using up most of their resources, or a good part of them: Nvidia had a contract there first, AMD second, so AMD would have to take a back seat and wait for the Nvidia cores to be done. Woot.
Posted on Reply
#4
btarunr
Editor & Senior Moderator
by: candle_86
Actually, if Nvidia is using up most of their resources, or a good part of them: Nvidia had a contract there first, AMD second, so AMD would have to take a back seat and wait for the Nvidia cores to be done. Woot.
Though I don't know how it works, it's unlikely that they serve their customers the way a crowded local breakfast shack would; they can do things in parallel. X-Mas '07 saw the HD3870 in plenty while the 8800 GT went out of stock in most places - that could also be because they didn't want to produce it on a large scale then. After it was obvious there was high demand for it, we saw news such as "NV suggests retailers use 6-layer PCBs", all of which suggested they were going to produce it on a large scale. Now there's plenty of 8800 GTs everywhere.
Posted on Reply
#5
Millenia
by: candle_86
Actually, if Nvidia is using up most of their resources, or a good part of them: Nvidia had a contract there first, AMD second, so AMD would have to take a back seat and wait for the Nvidia cores to be done. Woot.
Boy, you REALLY hate AMD, don't you? :laugh:
Posted on Reply
#6
candle_86
Didn't until they bought Hell, aka ATI. Now they sleep with the devil, and we rely on Nvidia - the video card Christ uses in heaven to win.
Posted on Reply
#7
Wile E
Power User
What you don't realize is, if AMD goes down, that spells the end for us consumers. Not only will prices go up, but nVidia and Intel would have no reason to devote so much time and money to R&D, drastically slowing forward progress.

Hoping any part of AMD goes down is just fanboy ignorance.
Posted on Reply
#8
Rebo&Zooty
by: Wile E
What you don't realize is, if AMD goes down, that spells the end for us consumers. Not only will prices go up, but nVidia and Intel would have no reason to devote so much time and money to R&D, drastically slowing forward progress.

Hoping any part of AMD goes down is just fanboy ignorance.
What do you expect from candle_86? Every post I see him make is negative about AMD/ATI and wishing them ill... The stupid part is that if they die, as you said, so does the market for people like us - back to the early days, when it cost $6k for a shitty computer...
Posted on Reply
#9
Rebo&Zooty
by: btarunr
Though I don't know how it works, it's unlikely that they serve their customers the way a crowded local breakfast shack would; they can do things in parallel. X-Mas '07 saw the HD3870 in plenty while the 8800 GT went out of stock in most places - that could also be because they didn't want to produce it on a large scale then. After it was obvious there was high demand for it, we saw news such as "NV suggests retailers use 6-layer PCBs", all of which suggested they were going to produce it on a large scale. Now there's plenty of 8800 GTs everywhere.
He's just a totally insane fanboy. I bet he'd be the first one crying if AMD/ATI died and he wasn't able to get decent hardware for a reasonable price...

If any of the current companies go down, say goodbye to innovation and hello to paying 300 bucks for an 8400/3400 or a 3600+/E21*0 CPU, and a nice premium on the board as well.
Posted on Reply
#10
Millenia
by: candle_86
didn't untill they bought Hell aka ATI, now they sleep with the devil and we rely on Nvidia, the video card Christ uses in heaven to win.
I personally see it completely the opposite way, but meh, your call :p
Posted on Reply
#11
erocker
by: candle_86
Didn't until they bought Hell, aka ATI. Now they sleep with the devil, and we rely on Nvidia - the video card Christ uses in heaven to win.
That's some of the worst garbage I've ever heard.:shadedshu


Stay on topic please...
Posted on Reply
#12
HTC
by: Wile E
What you don't realize is, if AMD goes down, that spells the end for us consumers. Not only will prices go up, but nVidia and Intel would have no reason to devote so much time and money to R&D, drastically slowing forward progress.

Hoping any part of AMD goes down is just fanboy ignorance.
And if you need an example of this, look at Blu-ray prices now as opposed to when HD DVD was "still in the race".

EDIT

If this nVidia move helps in the "friendly" war between AMD and nVidia, I'm all for it, since it's the consumer that gets the benefits!
Posted on Reply
#13
candle_86
by: Rebo&Zooty
What do you expect from candle_86? Every post I see him make is negative about AMD/ATI and wishing them ill... The stupid part is that if they die, as you said, so does the market for people like us - back to the early days, when it cost $6k for a shitty computer...
I have every right to wish them ill. They bought ATI, who is never on time, never releases a compelling product, etc. The last compelling thing they made was the 9700 Pro. They claimed SM3 wasn't important, screwing customers who in 2005/06 were still using the hardware to try to play games that are SM3-dependent and won't work on SM2 cards. They made HDR+AA exist, yet it's not playable, IMO, unless you run at low res or at like 30 FPS, so it's a useless feature on their part. Crossfire for the longest time was a waste of money and a poor mockup. The R600 core possesses some of the worst designs to come out since their Rage 128. Yet they keep making either worthless features or ill-planned cards that don't do what they claim, so yes, I wish them ill.
Posted on Reply
#14
Rebo&Zooty
Dunno, the X1900 range was still kicking your dear nVidia's arse till the 8800 GTS/GTX came out, so your bias is bullshit.

And bullshit - HDR+AA works great on my X1900 XT/XTX cards: Far Cry, Oblivion and a slew of other games. Hell, my 8800 GT buggers up if I try to run it with AA+HDR... nVidia's drivers have bugs they refuse to spend the time to fix - look up the crash bug under Server 2003 and x64 XP. It's been a KNOWN BUG since those OSes came out, but nVidia hasn't fixed it, because they're too busy getting a 2 FPS boost in Crysis and a few 3DMarks so they can be king of the bench/review sites...

As to the R600/670, I have seen a few reviews indicating that in NATIVE DX10 games/benches the R600 pulls ahead of the nVidia cards even with AA enabled. This is because ATI designed the chips for DX10 and to DX10.x specs, so the AA is shader-based. On the other hand, nVidia's 8800 line is NOT native DX10 - it's actually a DX9 card with DX10 shader support; the 8800/9800 use dedicated hardware for AA, as was needed for DX9 performance.

ATI's mistake here was not adding the hardware AA units, AND believing that Vista would take off, that every gamer would move to it, and that every game maker would move to DX10 because all the gamers moved to Vista.

Vista flopped, and since it flopped, so did the DX10-native design of the R600/670 range of cards - well, not really a flop for the 3800 cards, since they're still selling very well.

Oh, and HDR+AA has NO PERFORMANCE IMPACT AT ALL on the X1900 range of cards - you can add HDR to AA or AA to HDR with no extra hit. Google some reviews ;)

As to the Rage 128: the hardware was great. The drivers for 2K sucked, but the 98 drivers were decent and gave the TNT/TNT2 line of cards a run for their money, mostly due to the fact that the Rage 128 was native 32-bit, so using 32-bit mode didn't have the impact that it had on nVidia's cards. I had BOTH lines; the worst was the Rage 128 MAXX edition, due to drivers never maturing for the dual-chip card.

Nvidia has had its flops - look at the FX line. They all sucked for DX9 stuff, despite that being their main selling point. Just utter crap... again, I know from personal experience.

Each company has screwed up.

Examples of late-to-market or under-supplied items:

Radeon VE: cheap design with no hardware T&L, but they never implied it had it.
X800: available, BUT in short supply at first.
X1800: very late to market, but at least around here it could be had at MSRP with ease.
2900: late to market, used too much power, ran hot, poor DX9 AA performance, and poor AVIVO decoding support.
38*0: same poor AA perf as above, other problems fixed.
3870 X2: a little late to market, too big a hit when you crank AA up.

nVidia:
GeForce 1 SDR/DDR: late to market by 2-3 weeks, VERY short supply; it took me camping CompUSA to get one.
GeForce2 GTS/Ultra: see above; they were a couple weeks late actually getting cards into stores, and then the supplies were FAR too low for the demand.

GeForce4 Ti 4400/4800: damn near impossible to find those 2 models; they were more marketing BS than a product you could actually get hold of.

GeForce 5/FX: marketed on time, ran fine in OLD games, but as soon as you moved to DX9 games they fell on their faces due to VERY poor design. Many FX-line cards also ran VERY hot - the 5800 Ultra was the worst design I had seen pre-8800 GT stock cooler...

7800: in short supply, at least around here; you had to order one and wait your turn to get one as they came in.
7900/7950: see above.

nVidia's cards tend to come into stock more often but in smaller numbers, at least around here, so if a store runs out of nVidia cards you've gotta camp it to get one for sure. ATI cards tend to come into stock in reasonable numbers, in my experience - this is since the 9600 days. The X800 line was in short supply if you wanted an XT/XT PE card, but the Pro VIVO was available aplenty, and most of them flashed into XT PE cards no problem at all!

The FX line was horrible. I had a few of them; the 5700 was the only one that didn't totally tank in DX9, and even its perf was at best BLAH compared to the ATI cards in the same price range.

As to Shader 3 vs. Shader 2: as you would know if you weren't a fanboy, the X800/850 can do HDR - look at Half-Life 2: Lost Coast and the 2 expansions. They use SM2 HDR and it looks damn good (Lost Coast's was a bit... meh, but it was just an early tech demo).

http://en.wikipedia.org/wiki/High_dynamic_range_rendering
Read up:
Graphics cards which support HDRR

This is a list of graphics cards that may or can support HDRR. It is implied that because the minimum requirement for HDR rendering is Shader Model 2.0 (or in this case DirectX 9), any graphics card that supports Shader Model 2.0 can do HDR rendering. However, HDRR may greatly impact the performance of the software using it; refer to your software's recommended specifications in order to find specifications for acceptable performance.
SM3/FP16 HDR requires X1K or 6-series nVidia cards, BUT on the 6 series the performance hit of HDR is such that you've gotta lower the res to play, and NO card pre-8800 can do HDR+AA on the nVidia side, whereas the X1800/X1900 and any X1K card can do HDR+AA with no additional perf hit when you combine them :)

http://www.firingsquad.com/hardware/hdr_aa_ati_radeon_x1k/
Conclusion

Up to this point, most gamers have pretty much come to accept that you can’t combine HDR with AA. A lot of this is because hardware capable of taking advantage of both features just hasn’t existed until just recently, but another significant factor which can’t be understated is the performance hit that’s traditionally come from enabling both features. After all, we all saw the huge performance hit that came from turning on HDR with Far Cry a few years ago, and more recently we saw it again in Oblivion with HDR, where today’s latest high-end cards ran with frame rates that were even slower than 4xAA once HDR was enabled. Based on all this evidence, who would have thought you could combine the two and still get pretty good performance? Certainly not us.

Until today that is.

Adding 2xAA/8xAF to Far Cry running with HDR had very little effect on the Radeon cards relatively speaking. At 1600x1200 the Radeon X1900 XTX’s performance drops by just 2 fps, or a little over 4%, while the X1800 XT 512MB sees an even slimmer 2% drop off. Even the slower Radeon X1800 GTO and X1900 GT cards see only slight declines once AA is added on top of HDR in Far Cry.

Under the greater demands of Oblivion, the margins are definitely greater, but we still saw manageable frame rates; adding AA to HDR actually comes free at 1024x768 for all cards except the Radeon X1800 GTO, and keep in mind that we could easily turn down the graphics settings a little for even better performance. In our outdoors testing the Radeon X1900 XTX saw a performance dropoff of nearly 30% while the Radeon X1800 XT took a performance hit of 21% once HDR+AA was enabled. Similarly, the Radeon X1900 GT took a greater hit than the GTO.

This is probably because the GPU on the older R520 cards is already pretty bottlenecked once HDR is running in Oblivion; once AA is added, the GPU can't bottom out much further. In the case of the R580-based cards, they're not quite as overtaxed with just HDR running, so once HDR+AA is enabled you see performance decline significantly.

The differences between running with HDR and HDR+AA aren’t quite as significant in foliage testing simply because the foliage area is more stressful on the graphics card than the outdoors area.

Looking over the results, HDR+AA is certainly a lot more feasible on ATI’s Radeon X1K cards than we initially thought. Getting playable frame rates with HDR+AA shouldn’t be too hard as long as you keep the eye candy in check, and with older games like Far Cry you should be able to turn it all on while also running HDR+AA without any problems.

Now if we can just see what kind of performance we can expect from Crysis and Unreal Tournament 2007 once HDR+AA is turned on. Unfortunately we’ll all have to wait a little longer to see those results…
As we all know, Oblivion, like Crysis, is poorly optimized (I'm being kind).

But even with the perf hit, at least it works and is playable. The 7950 and lower CAN'T do AA + FP16 HDR - it's just not possible. They can do the same HDR the 9700/X800/X850 cards can do, combined with AA, because that's FP12-based (used in the Half-Life games).

Blah, I went on too long, but hey - you spew bullshit, I've gotta counter it.
Posted on Reply
#15
HTC
Hell of a way to make an argument, Rebo&Zooty: well done!
Posted on Reply
#16
Rebo&Zooty
by: HTC
Hell of a way to make an argument, Rebo&Zooty: well done!
It's my forte. Some places on the net I'm known to make a post that's up to a page long if I feel something needs explaining in detail, or if I feel that somebody needs slapping with FACTS because they're acting like a total fanboy.

Every company has issues, and I used to hate ATI, as have many others, but I got over that after I found that nVidia arse-raped me on the high-end FX card they sold me for DX9 gaming.
Posted on Reply
#17
candle_86
First off, no idea where you live - under a rock, I guess - but none of those cards except the X1800, X800 and Ti 4400 were ever in short stock here. The 4400 was in short supply for one reason: no demand. It was priced too closely to the 4600, while the 4200 was priced a fair amount cheaper and could OC to that level. As for the FX, no one says they were great, but remember, when they came out there wasn't a true DX9 game to begin with, and nVidia did really well until Far Cry showed up, followed by HL2. As for HDR, FP12 is a poor man's HDR. As for games that can't run on SM2 period, look at Splinter Cell: Chaos Theory and other recent games. My friend had to replace his X800 XT with a 6800 Ultra because he couldn't play SC:CT on it. But of course, to ATI, SM3 isn't important at all. Oh, one more thing: FP12 HDR, along with the HL2 HDR, has more in common with bloom than HDR. The vibrance and color disparity, along with the contrast, just isn't there. I refuse to play with HDR or bloom on the Source engine because it looks like total crap. ATI loses again and again. Want the real crap line of cards? Here ya go:


Rage 1-128 Pro (bad drivers, late to market)
Radeon 70xx (bad drivers, late to market, uncompetitive)
Radeon 8500 (same as above; by the time they caught the Ti 500, the Ti 4200 was at the same price point)
GeForce FX (poor DX9 performance, great OpenGL and DX8 though)
Radeon X800 (no SM3 support hurts the cards later)
Radeon X1800 (late to market, uncompelling, very limited supply)
Radeon HD 2xxx/3xxx (late to market for the 2xxx, poor performance, uncompelling)

There ya go - the truth.
Posted on Reply
#18
Wile E
Power User
"Uncompelling" isn't even an argument. It's not substantiated in any way. As for R600, poor performance? What rock do you live under? The R600 was a great performer, and matched very well with it's intended target, the 8800GTS G80. Although it is a power hog.

And I refuse to argue about the FX line or X1800 or whatever. That stuff is ancient history. Both companies have been guilty of releasing shitty products with poor support. Let it the hell go already. It's stupid to argue about, and completely stupid to use as the basis of an argument years later. Both of you just shut up.

Here is the only important fact: if AMD goes down, like you wish, candle, we will all lose as consumers.
Posted on Reply
#19
btarunr
Editor & Senior Moderator
R600 (HD2900 XT 512M, VisionTek) has become a collectors' item here. $800 :laugh:
Posted on Reply
#20
DaedalusHelios
by: Wile E
"Uncompelling" isn't even an argument. It's not substantiated in any way. As for R600, poor performance? What rock do you live under? The R600 was a great performer, and matched very well with it's intended target, the 8800GTS G80. Although it is a power hog.

And I refuse to argue about the FX line or X1800 or whatever. That stuff is ancient history. Both companies have been guilty of releasing shitty products with poor support. Let it the hell go already. It's stupid to argue about, and completely stupid to use as the basis of an argument years later. Both of you just shut up.

Here is the only important fact: if AMD goes down, like you wish, candle, we will all lose as consumers.
I think if ATI/AMD did go under, it would just be IBM that would buy them in bankruptcy court. Then, 6 months later (and with IBM's unlimited resources), we would have more competition than we do now. There would be a brief dry period, but that wouldn't last long, as IBM is just looking for a good time to step forward and buy all the equipment and infrastructure it needs at little cost to them. So it would be IBM vs. NV and Intel. :cool:

I bet IBM could build a better quad than AMD. ;)
The only reason Intel is pulling ahead so easily is the vast resources it can throw at R&D.

Take that same logic and apply it to IBM vs. Intel..... IBM would give Intel more competition than they would know what to do with. Resources are a big part of the equation.

Nvidia has enough funds to buy AMD/ATI right now, but it would be taking on the large debt that AMD/ATI has on its shoulders. :(
Posted on Reply
#21
Wile E
Power User
by: DaedalusHelios
I think if ATI/AMD did go under, it would just be IBM that would buy them in bankruptcy court. Then, 6 months later (and with IBM's unlimited resources), we would have more competition than we do now. There would be a brief dry period, but that wouldn't last long, as IBM is just looking for a good time to step forward and buy all the equipment and infrastructure it needs at little cost to them. So it would be IBM vs. NV and Intel. :cool:

I bet IBM could build a better quad than AMD. ;)
The only reason Intel is pulling ahead so easily is the vast resources it can throw at R&D.

Take that same logic and apply it to IBM vs. Intel..... IBM would give Intel more competition than they would know what to do with. Resources are a big part of the equation.

Nvidia has enough funds to buy AMD/ATI right now, but it would be taking on the large debt that AMD/ATI has on its shoulders. :(
They might be able to resurrect ATI, but the CPU division would still be out of the picture, leaving us to get raped by Intel. AMD's x86 license is not transferable, so if they go under, so does that license. IBM wouldn't be able to use it.
Posted on Reply
#22
DaedalusHelios
They could buy Cyrix/VIA.

They could find the money to do that by reducing the amount of free snacks given out at meetings. :laugh:
Posted on Reply
#23
Rebo&Zooty
I think IBM has their own x86 license, tho ;)
Posted on Reply
#24
HTC
by: Rebo&Zooty
I think IBM has their own x86 license, tho ;)
Which is why I believe Intel would do its best to try and block an ATI (or nVidia) buyout by IBM, because that would mean they could try to make a CPU out of a GPU. Or am I thinking wrong?
Posted on Reply
#25
DaedalusHelios
by: Rebo&Zooty
I think IBM has their own x86 license, tho ;)
Not yet... and that's a fact ;)



by: HTC
Which is why I believe Intel would do its best to try and block an ATI (or nVidia) buyout by IBM, because that would mean they could try to make a CPU out of a GPU. Or am I thinking wrong?
It's totally different. Tesla is great for floating-point calculations in oil discovery, but that's not really a CPU architecture, and it works in totally different ways. (That's the closest thing to what you were trying to say, I think ;))
Posted on Reply