
AMD Completes Acquisition of Silo AI

Saying this is like the dotcom bubble isn't the negative implication you seem to think it is. Online retail rose during the dotcom bubble and became successful because of the funding it provided:


"In 2000, the dot-com bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable.[5] However, many others, particularly online retailers like eBay and Amazon, blossomed and became highly profitable."

This is the same for any new market: a ton of startups pop up, and most of them go out of business while the industry figures out what does and doesn't work.

If you are implying that AI has no use, you'd be dead wrong; from a medical and engineering perspective alone it's already indispensable.

The AI of everything will burst.

Bubble? Burst? The dot-com crash was a medium-size correction before cancer-like growth that hasn't stopped since; it only eased a bit in 2023. One more bubble burst like that, and Nvidia + Amazon + Apple + Microsoft + Google + a couple of AI newcomers will constitute 80% of the world economy before 2050.


It's a bubble that is going to burst once people realize how little actual work it is doing and how many resources it's sucking up for mediocre AI BS; it can't even make me coffee yet. The dot-com bubble burst when people didn't flock to all the websites that were flashy but did nothing, and this will be the same. Then it will be put to use where it should be: drug research, climate modeling, DNA function modeling, physics.

The fact that even with a stack of hardware it can't handle the data density means it's never going to scale to truly functional status within current silicon limitations, except in some supercomputers or for niche applications. A company I work with uses machine learning to determine seed health/viability in almost real time, and it takes a whopping two GPUs. I don't need AI to text my wife and ask what's for dinner.
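(For a sense of scale, here is a minimal sketch of what that kind of two-GPU seed-viability workload could look like. The model choice (ResNet-18), the class labels, and the two-GPU split are my own assumptions for illustration, not the actual system that company runs.)

Code:
# Minimal sketch (assumptions, not the real system): batched seed-viability
# inference with a small CNN, spread across two GPUs when they are available.
# Assumes PyTorch plus torchvision >= 0.13.
import torch
import torch.nn as nn
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small off-the-shelf CNN with a two-class head: viable vs. non-viable seed.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()

# Use both GPUs if present; otherwise this still runs on a single device/CPU.
if torch.cuda.device_count() >= 2:
    model = nn.DataParallel(model, device_ids=[0, 1])
model = model.to(device)

@torch.no_grad()
def classify(batch: torch.Tensor) -> torch.Tensor:
    """batch: (N, 3, 224, 224) camera crops of individual seeds."""
    logits = model(batch.to(device))
    return logits.argmax(dim=1)  # 0 = non-viable, 1 = viable (assumed labels)

# Dummy batch standing in for camera frames.
print(classify(torch.randn(32, 3, 224, 224)))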
 
The AI of everything will burst.

That is not going to happen given AI is already used in designing computer chips and many other products.

As an example, AI-based OPC (optical proximity correction) produces a much higher-quality lithography mask that is more resistant to errors and variation. Part of that variation comes down to the difference between a design and what the lithography equipment actually prints; AI-based OPC is better at accounting for that gap and adjusting the mask so the printed pattern more closely matches the intended design.
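(A cartoon of the idea, entirely my own toy sketch rather than any foundry's actual OPC flow: a tiny network learns, from a synthetic "print model", how much to bias a drawn feature so the printed result lands back on target.)

Code:
# Toy sketch of ML-based OPC, not a real flow: a tiny network learns the bias
# needed so a synthetic "print model" reproduces the intended line width.
import torch
import torch.nn as nn

torch.manual_seed(0)

def toy_print_model(drawn_width_nm: torch.Tensor) -> torch.Tensor:
    # Pretend the scanner prints lines ~10% thinner plus a fixed 2 nm loss.
    return 0.9 * drawn_width_nm - 2.0

# Targets: desired printed widths. Labels: the bias that makes the toy model hit them.
target = torch.linspace(20.0, 200.0, 500).unsqueeze(1)   # nm
true_bias = (target + 2.0) / 0.9 - target                 # solves toy_print_model(target + bias) = target

net = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(target / 100.0), true_bias)  # crude feature scaling
    loss.backward()
    opt.step()

drawn = torch.tensor([[45.0]])
corrected = drawn + net(drawn / 100.0)
print(f"drawn 45.0 nm -> mask {corrected.item():.1f} nm -> prints {toy_print_model(corrected).item():.1f} nm")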

In addition, there are already products on the market that use AI-based reductive design (check out the Titan fingertip mouse for an example). This is a process by which an AI optimizes a design by removing unnecessary material. This single use of the technology alone has the potential to reduce cost and waste on a massive scale.

The fact that even with a stack of hardware it can't handle the data density means it's never going to scale to truly functional status within current silicon limitations, except in some supercomputers or for niche applications. A company I work with uses machine learning to determine seed health/viability in almost real time, and it takes a whopping two GPUs. I don't need AI to text my wife and ask what's for dinner.

I wouldn't take a non-purpose-built device like a GPU as a true measure of what future AI ASICs will be able to accomplish in terms of performance per watt.
 
Saying this is like the dotcom bubble isn't the negative implication you seem to think it is. Online retail rose during the dotcom bubble and became successful because of the funding it provided:


"In 2000, the dot-com bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable.[5] However, many others, particularly online retailers like eBay and Amazon, blossomed and became highly profitable."

This is the same for any new market: a ton of startups pop up, and most of them go out of business while the industry figures out what does and doesn't work.

If you are implying that AI has no use, you'd be dead wrong; from a medical and engineering perspective alone it's already indispensable.
Well, my research group was basically wiped out and hundreds of us lost our jobs. And we were a high-tech optics group doing fibre optics and the like, not some retailer.
 
As an example, AI-based OPC (optical proximity correction) produces a much higher-quality lithography mask that is more resistant to errors and variation. Part of that variation comes down to the difference between a design and what the lithography equipment actually prints; AI-based OPC is better at accounting for that gap and adjusting the mask so the printed pattern more closely matches the intended design.

In addition, there are already products on the market that use AI-based reductive design (check out the Titan fingertip mouse for an example). This is a process by which an AI optimizes a design by removing unnecessary material. This single use of the technology alone has the potential to reduce cost and waste on a massive scale.

As I've said before: it's never called "AI" when it's a true product with an actual use.

Various steps of chip design and manufacturing go by names like automated theorem proving, binary decision diagrams, verification, layout and routing, and more. They were once called "AI" back in the 1980s, when there was another AI bubble, but today these "old AI" tricks have proper names that everyone understands, and people can explain why one algorithm is better than another.

Ex: automated layout and routing can be tuned for less area or for higher performance on different chip designs. It's no longer "AI"; it's just a tool used by chip manufacturers.

And mind you: these tools don't need data centers full of H100s. The hardware that's actually kinda-sorta useful for this task is 3D V-Cache EPYC servers, IIRC.
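(To make the layout-and-routing point concrete, here is a minimal sketch of the kind of thing such a tool does at its core: a Lee-style grid maze router that finds a shortest wire path between two pins around blockages. My own toy example, not any EDA vendor's code.)

Code:
# Toy Lee-style maze router: BFS over a grid to find a shortest wire path
# between two pins while avoiding blocked cells. Real EDA routers add layers,
# design rules, and congestion costs; this just shows the core idea.
from collections import deque

def route(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = blocked. Returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:        # reconstruct path back to the start pin
            path, cur = [], goal
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # unroutable

layout = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
print(route(layout, (0, 0), (4, 4)))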

----------

Today's modern AI is summarized as:

1. Buy (or rent) NVidia chips

2. Run them with huge amounts of electricity.

3. ???!?!?!?!????

4. Profit.

And everyone seems to trip up at what step #3 entails exactly. Don't get me wrong, there are plenty of legitimate uses at step #3. But in my experience, the people who know what that task is don't call it "AI". They call it "Bin Packing", or "an NP-Complete solution", or "Automated Routing": more specific words that other technical people recognize, so we know who the bullshitters are versus the real wizards.
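(If you want a picture of what one of those step-#3 workloads actually looks like, here is a minimal first-fit-decreasing bin-packing sketch of my own; a stand-in for illustration, not anyone's production scheduler.)

Code:
# First-fit-decreasing bin packing: a classic NP-hard workload that shows up in
# scheduling, VM placement, shipping, etc. FFD is a simple greedy heuristic.
def first_fit_decreasing(items, bin_capacity):
    bins = []  # each bin is a list of item sizes
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= bin_capacity:
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits; open a new one
    return bins

# Example: pack jobs of various sizes onto machines with capacity 10.
jobs = [9, 8, 2, 2, 5, 4, 7, 1, 3, 6]
for i, b in enumerate(first_fit_decreasing(jobs, 10)):
    print(f"machine {i}: {b} (load {sum(b)})")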

Not that I'm a high-wizard or anything. But I've seen them talk once or twice.
 
 
As I've said before: it's never called "AI" when it's a true product with an actual use.

This is for the most part true but it's an entirely separate argument.

Various steps of chip design and manufacturing go by names like automated theorem proving, binary decision diagrams, verification, layout and routing, and more. They were once called "AI" back in the 1980s, when there was another AI bubble, but today these "old AI" tricks have proper names that everyone understands, and people can explain why one algorithm is better than another.

Ex: automated layout and routing can be tuned for less area or for higher performance on different chip designs. It's no longer "AI"; it's just a tool used by chip manufacturers.

And mind you: these tools don't need data centers full of H100s. The hardware that's actually kinda-sorta useful for this task is 3D V-Cache EPYC servers, IIRC.

The examples I pointed out were not traditional algorithm-based techniques, and they were certainly not available in the 1980s. The advancements I pointed out are specifically based on neural networks and enhance OPC and manufacturing beyond what is possible with a traditional algorithm.
 
The examples I pointed out were not traditional algorithm-based techniques, and they were certainly not available in the 1980s. The advancements I pointed out are specifically based on neural networks and enhance OPC and manufacturing beyond what is possible with a traditional algorithm.

Oh, so it's faster and better than the old 1980s algorithm?

Why are we pouring Billions of dollars into GPUs and using so much electricity that the capacity auction on the East Coast just jumped 900% this past year?

That's why we need to get specific here. There is a lot of bullshit involving a lot of money here. A faster and better algorithm leads to FEWER computers needed and LESS power usage.

Not.... Whatever the hell is going on in Virginia's data center expansion or the ridiculous sums of money sent to NVidia.
 
And everyone seems to trip up at what step #3 entails exactly.
Not everyone, although the monetization part is still tricky. Given enough hours of training these robots on menial jobs, you can basically kiss Foxconn's cheap(?) slave factories goodbye :ohwell:
 
Not everyone, although the monetization part is still tricky. Given enough hours of training these robots on menial jobs, you can basically kiss Foxconn's cheap(?) slave factories goodbye :ohwell:




We've had robots for years. None of these systems require $Billion GPU Data centers.

The vast majority of electronics work at Foxconn and other Chinese factories is done by pick-and-place robots, reflow robots, automated testing robots, and automatic X-ray inspection robots.

Oh right, and it was all automated back in the Sony Walkman days. Remember lights-out manufacturing? That was back in the '90s, when the factory would run without any humans in it, so there was no need to install lightbulbs in the factory anymore. That was 30 years ago.

------------------

In any case, industrial automation (be it palletizers, depalletizers, automated farm equipment, pick-and-place machines, reflow ovens, PCB manufacturing, verification and test, etc.) is "not AI". These systems don't even have GPUs or neural nets in them. It's just normal algorithms and regular ol' control theory.
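(As in, the kind of control loop below: a bare-bones PID sketch of my own, the sort of thing running a reflow oven's temperature, with no neural net in sight. The gains, setpoint, and toy thermal model are made-up numbers.)

Code:
# Bare-bones PID temperature controller, the kind of plain control theory that
# runs a reflow oven. Gains, setpoint, and the toy "oven" model are made up.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured, dt):
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy simulation: heater power raises temperature, losses pull it toward ambient.
pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=230.0)  # reflow peak ~230 C
temp, dt = 25.0, 1.0
for step in range(120):
    power = max(0.0, min(100.0, pid.update(temp, dt)))    # clamp heater output 0-100%
    temp += 0.05 * power * dt - 0.02 * (temp - 25.0) * dt  # crude thermal model
    if step % 20 == 0:
        print(f"t={step:3d}s  temp={temp:6.1f} C")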

Going back to my earlier statement: the issue is:

#1: Buy (or rent) $Billion worth of GPUs.

Which necessarily requires $Billions in profits (not revenue) before that's a worthwhile investment. And when #2 is "shove $hundred-millions of electricity into those GPUs", it's going to be difficult to turn a profit.
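(A back-of-the-envelope sketch of that math, where every figure is a made-up assumption for illustration rather than anyone's actual price sheet or power bill:)

Code:
# Back-of-envelope GPU cluster economics. Every number here is an assumed,
# illustrative figure, not a real quote or electricity bill.
gpu_count          = 200_000    # GPUs in the cluster (assumed)
gpu_price          = 30_000     # USD per accelerator (assumed)
gpu_power_kw       = 1.5        # kW per GPU incl. cooling overhead (assumed)
electricity_rate   = 0.10       # USD per kWh (assumed)
utilization        = 0.70       # fraction of hours actually running jobs
depreciation_years = 4          # write the hardware off over this period

capex = gpu_count * gpu_price
annual_energy_kwh = gpu_count * gpu_power_kw * 24 * 365 * utilization
annual_power_cost = annual_energy_kwh * electricity_rate
annual_capex_share = capex / depreciation_years

print(f"capex:                ${capex / 1e9:.2f} B")
print(f"yearly electricity:   ${annual_power_cost / 1e6:.0f} M")
print(f"yearly cost to cover: ${(annual_capex_share + annual_power_cost) / 1e6:.0f} M")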
 