
Silicon Wafer Pricing Falling for the First Time in Three Years

TheLostSwede

News Editor
Semiconductors are largely made using silicon, even though other substrates, such as gallium nitride or silicon carbide, can be used as well. Most semiconductors today are made on silicon wafers, which makes silicon wafers a key material in the semiconductor industry. Over the past three years, the price of silicon wafers has increased due to higher demand for semiconductors. However, as there is only a limited number of silicon wafer suppliers, especially at the larger 12-inch size, the increased material cost has had an impact on the cost of the final semiconductors.

Reports out of Taiwan suggest that 12-, 8- and 6-inch wafers are all starting to see a decline in price. We're talking single-digit percentages here, and it should be noted that these are spot prices, not contract prices, which are negotiated between the parties long before delivery. That said, the fact that spot prices are pointing downwards also means that companies with less favourable contract pricing are starting to want to renegotiate, as even a small saving here can lead to a bigger saving further down the line. Many IC manufacturers have also asked to pause their contract orders, as the utilisation rate of many foundry nodes is going down, which means the foundries don't need as many wafers as they have ordered. Hopefully this will all lead to lower semiconductor prices across the board this year, but it's too early to draw any real conclusions. It's also possible that the end customers won't see any direct benefits from lower costs to the manufacturers.
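To put the "small saving" point in perspective, here's a quick back-of-the-envelope calculation; every number in it is a made-up assumption for illustration, not a figure from the report:

```python
# Rough illustration of why even a single-digit wafer price drop matters at volume.
# All values below are hypothetical assumptions, not reported figures.
wafer_price_usd = 150          # assumed price of one raw 12-inch silicon wafer
price_drop = 0.05              # assumed 5% spot-price decline
wafers_per_month = 100_000     # assumed wafer starts for a mid-sized product line

saving_per_wafer = wafer_price_usd * price_drop
annual_saving = saving_per_wafer * wafers_per_month * 12

print(f"Saving per wafer: ${saving_per_wafer:.2f}")
print(f"Annual saving:    ${annual_saving:,.0f}")   # ~$9 million on these assumptions
```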



View at TechPowerUp Main Site | Source
 
That so? I guess it will no longer be $1600 for an RTX4090? :roll:

Nvidia: "Buy an RTX4090 now for only $3000! Limited stock remaining! Artificially induced of course, for your benefit!"
 
This is expected given all the wafer order reductions in Q3/Q4.
 
Looking forward to the next bust in the semiconductor industry; that would be the perfect time to upgrade for people who waited out 2020-2022. I doubt it's going to happen fast though, as AMD is still in the phase of managing supply to keep prices high instead of starting an all-out price war.
 
That so? I guess it will no longer be $1600 for an RTX4090? :roll:

It's not the wafer that's expensive, it's the 3nm that's expensive.

IIRC, 3nm requires huge amounts of ultra-pure water and energy to create. 3nm chips are made from the same wafers as any other process, but are a heck of a lot more expensive.

------------

Wafer costs are a bigger deal to the cheaper processes: think 40nm, 90nm, and above. These are used in car parts, microcontrollers, and other such "lower-tech" computers.
 
A year or so down the road I’m sure this will get named ‘the great price correction’

;)
 
I'll order one soon to make my own CPU and GPU.... :nutkick:
 
The headline they don't want you to see. Yeah, we won't see price cuts for CPUs or GPUs off the back of this, I'd wager.
 
The guy who eats wafers can now have two on Sundays.
 
I would like to see a Blackwell 102 die, 800 mm², on TSMC 3nm at the end of 2024 for less than 2k :)
 
It's not the wafer that's expensive, it's the 3nm that's expensive.

IIRC, 3nm requires huge amounts of ultra-pure water and energy to create. 3nm chips are made from the same wafers as any other process, but are a heck of a lot more expensive.

------------

Wafer costs are a bigger deal to the cheaper processes: think 40nm, 90nm, and above. These are used in car parts, microcontrollers, and other such "lower-tech" computers.
This is the bigger thing. Much of the current inflation is actually being driven by the tail end of the chip shortage. In the industry I work in, the shortage is pushing the stuff we buy to lead times of over a year, whereas before 2020 it might be a couple of months.

Do you want a basic industrial microcontroller? Sure! Only a few months before our next batch comes in. We will put you on the list for the fourth shipment after that.

Went to buy some ECM (aka industrial PWM) fans not long ago and got quoted an 18 month lead time. These are the kind of thing that the manufacturer usually ships the next day.

Automotive is another key part. There are still thousands upon thousands of cars, otherwise complete and ready, waiting on automotive chips.

The lower prices are an indicator of increased supply - the exact thing we need to hear.
 
It's also possible that the end customers won't see any direct benefits from lower costs to the manufacturers.
Of course this is about the end user(s). The majority of consumers either aren't buying or have severely reduced their spending, which means a third or more less revenue for these clowns that produce these products.

Companies, the MSM, etc. THINK their bosses are the shareholders, the OEMs with their cherry-picked free samples issued to the MSM, etc. What's interesting, though, is that WISE shareholders know that their real bosses are... CONSUMERS.
 
It's not the wafer that's expensive, it's the 3nm that's expensive.

IIRC, 3nm requires huge amounts of ultra-pure water and energy to create. 3nm chips are made from the same wafers as any other process, but are a heck of a lot more expensive.

------------

Wafer costs are a bigger deal to the cheaper processes: think 40nm, 90nm, and above. These are used in car parts, microcontrollers, and other such "lower-tech" computers.

Irrelevant; same ****, new node. The old nodes were also "new" at one point. :)
 
As long as the engineers of new architectures decide to "improve" or "enhance" characteristics mainly by cramming more transistors and more frequency into each square millimeter, nothing good awaits us. This is the approach of bodybuilders, not intellectuals.
 
As long as the engineers of new architectures decide to "improve" or "enhance" characteristics mainly by cramming more transistors and more frequency into each square millimeter, nothing good awaits us. This is the approach of bodybuilders, not intellectuals.

Oh, there are ways to improve the current architectures; I just don't think they want to release them yet. Got to keep the artificial "progression" alive and well.
 
Irrelevant; same ****, new node. The old nodes were also "new" at one point. :)

Old nodes require less equipment, cheaper equipment, fewer steps (aka: less labor), less electricity, and open air.

3nm requires more equipment, more expensive equipment, more steps (ie: more labor), more electricity, and ultrapure water (rather than "just" a cleanroom environment).

All of these things cost money. There's a reason why cars are made out of 40nm parts and chips: these older nodes are far more cost-efficient.
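As a rough illustration of that cost gap, here's a toy cost-per-good-die comparison using the standard Poisson yield approximation (yield ≈ e^(-D0·A)); the wafer prices, die sizes and defect density are assumed placeholder values, not real foundry numbers:

```python
import math

def cost_per_good_die(wafer_cost_usd, die_area_mm2, defect_density_per_mm2,
                      wafer_diameter_mm=300):
    """Cost of one working die, using a simple Poisson yield model: yield = exp(-D0 * A)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = wafer_area / die_area_mm2                 # ignores edge/scribe losses
    yield_rate = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return wafer_cost_usd / (gross_dies * yield_rate)

# Assumed numbers: a cheap mature-node wafer with a tiny MCU die vs. an expensive
# leading-edge wafer with a large GPU-class die, both at the same defect density.
print(f"Mature-node MCU die (10 mm^2):   ${cost_per_good_die(3_000, 10, 0.0005):.2f}")
print(f"Leading-edge GPU die (600 mm^2): ${cost_per_good_die(20_000, 600, 0.0005):.2f}")
```

On these made-up inputs the small mature-node die costs well under a dollar, while the big leading-edge die lands in the hundreds of dollars, which is the intuition behind keeping car parts and microcontrollers on old nodes.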
 
@Fourstaff you mean the big three, surely? Or are you also on the train that AMD must be the value option and is solely responsible for the price of parts, i.e. they're all at it, no?!

On topic: it's a wafer, not a chip. It can be used at 14 nm or 4 nm, and it's not the largest cost in chip manufacturing, so it won't affect the price we pay much.

It's the process costs that keep doubling over time; EUV ain't easy.
 
One thing I can't understand: why do we keep drilling down into smaller nanometers when we are already close to unsaleable prices? Not because of design costs, but because of increasing manufacturing costs and more and more units coming out of production defective, due to elements that are too tiny and unstable.
 
One thing I can't understand: why do we keep drilling down into smaller nanometers when we are already close to unsaleable prices? Not because of design costs, but because of increasing manufacturing costs and more and more units coming out of production defective, due to elements that are too tiny and unstable.

Because smaller means less electricity per transistor and more transistors in a given area.

TSMC 7nm gave you 91 million transistors per mm^2, while 5nm gave you 171 million transistors per mm^2 (and each of those 5nm transistors uses less electricity than each 7nm transistor).

I don't know how far 3nm improves things, but traditionally it's a 2x between full nodes. So we're looking at around 340 million transistors per mm^2 on the most recent processes.
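Just to make that arithmetic concrete, here's what those densities work out to on a single large die; the die size is an arbitrary assumption, and the 3nm figure is simply the ~2x-per-node guess from above, not a confirmed number:

```python
# Transistor counts implied by the densities quoted above (7nm and 5nm are the
# figures from the post; the 3nm value is the poster's ~2x extrapolation).
densities_mtr_per_mm2 = {
    "TSMC 7nm": 91,
    "TSMC 5nm": 171,
    "3nm (assumed ~2x per node)": 340,
}

die_area_mm2 = 600  # assumed large GPU-class die, purely for illustration

for node, density in densities_mtr_per_mm2.items():
    total_billion = density * die_area_mm2 / 1000
    print(f"{node}: ~{total_billion:.0f} billion transistors on a {die_area_mm2} mm^2 die")
```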

----------

The reason why car companies (and other companies) use 40nm or older chips is that your tire pressure sensor microcontroller (radio + ADC + logic; it's a very small computer) doesn't need to shrink. The vast majority of "computers" you use (radio interface, car starter, keyfob, tire pressure sensor, antilock brakes, thermostat, toaster oven timer, etc.) are of this class of microcontroller. They don't shrink; they stay the same for 20+ years at a time. Heck, 40nm is rather luxurious for these, as a lot of them are still on 180nm or older.

The computers you use as a professional, pro-sumer, or enthusiast do new things. Raytracing requires more transistors to compute where all the light is going. Artificial intelligence requires new circuits (tensor cores) that speed up computations. Spending more transistors to double your core size, or offer 2x, 3x, 4x the cache of older systems is universally beneficial to all video games and applications.
 
So, more and more inefficient transistors. 4X more transistors for 1.5X increased performance.
 
So, more and more inefficient transistors. 4X more transistors for 1.5X increased performance.

1.5x increased performance from the same wafer, for the same electricity, is what most people care about.

IIRC: Google and other datacenters use megawatts of power. Getting 1.5x more performance inside the same electrical usage is well worth the money.

On the other end of things: cell phones. Getting more performance out of the same battery life is also huge, and can only be accomplished with shrinking nodes.
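A trivial sketch of why that matters at datacenter scale; the power budget and baseline throughput are arbitrary placeholder numbers, and only the 1.5x ratio comes from the discussion above:

```python
# Toy numbers: the only figure taken from the post is the 1.5x performance gain
# at the same power. Everything else is an arbitrary assumption.
facility_power_mw = 20          # assumed datacenter power budget
work_per_mw_old = 1_000_000     # arbitrary throughput units per MW on the old node
perf_gain = 1.5

old_throughput = facility_power_mw * work_per_mw_old
new_throughput = old_throughput * perf_gain

print(f"Old node: {old_throughput:,.0f} units of work in {facility_power_mw} MW")
print(f"New node: {new_throughput:,.0f} units of work in the same {facility_power_mw} MW")
```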
 
About time. Just wish this resulted in us all receiving more affordable tech across the board. Helps create additional profit over time for the manufacturers once they renegotiate their contracts.
 