
Potential graphics card shortage incoming... interesting idea. A.I. demand taking up TSMC space.

This demand for machine learning hardware should not come as a surprise to anyone who follows semiconductor technology.

Nvidia has enjoyed a flourishing and fast-growing AI business for several years. You can read about how various companies and industries use machine learning in the enterprise-focused sections of Nvidia's website, for example grocery chains doing inventory analysis.

Also, Nvidia's datacenter business eclipsed their gaming business a while ago (a year ago? maybe more?).

As we know, from a consumer angle, Nvidia's most prominent machine learning feature has been DLSS so far, but other applications have been in use for a while. I've used both Nvidia Broadcast and Nvidia Canvas for a couple of years.

Apple started shipping machine learning cores (their "Neural Engine") with the A11 SoC back in 2017. They opened up Neural Engine access to third-party developers the following year with the A12 SoC.

Without a doubt, Nvidia will continue serving the graphics industry rather than becoming an AI pure play.

Perhaps the most interesting thing about Jensen's SIGGRAPH keynote was the heavy emphasis on the Nvidia Omniverse platform as a cloud technology. You can do the basic prototyping on an AI-hardware-equipped workstation, but do all of the analysis and other heavy lifting on datacenter-hosted cloud systems.

That means Nvidia isn't planning on selling every GPU chip. They rent out GPUs for AI, just like Amazon AWS has been renting out excess computing resources for nearly two decades.

In the long run, a big corporation will find it more cost-effective to buy its own GPU hardware and integrate it into its data centers. But for smaller companies, startups, and newcomers to machine learning, renting GPU cycles from Nvidia is a possibility, just like using Amazon EC2 instances for small computing projects.
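
To make that concrete, here's a rough sketch of what renting a single GPU instance looks like with boto3; the AMI ID is a placeholder and g4dn.xlarge is just one example of a GPU-equipped instance type:

```python
import boto3

# Minimal sketch: launch one GPU-equipped EC2 instance for a small ML job.
# The AMI ID is a placeholder; g4dn.xlarge carries a single NVIDIA T4 GPU.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder deep learning AMI
    InstanceType="g4dn.xlarge",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; remember to terminate it when the job is done.")

# Later, when finished:
# ec2.terminate_instances(InstanceIds=[instance_id])
```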

CAGR for the machine learning market blows the doors off the consumer graphics business, which plods along. Nothing new there; it has been in Nvidia's quarterly earnings statements and the slide decks they post on their site. Nvidia is putting their focus on the market with the most upside over the next ten years. They would be stupid to churn out graphics cards for the DIY consumer PC market when the margins are so much higher elsewhere. Nvidia is a publicly owned company, and their number one priority is to increase shareholder value.
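
For anyone unfamiliar with the term, CAGR is just the compound annual growth rate between two endpoints; a quick sketch with made-up figures, purely to illustrate the formula (not actual market numbers):

```python
# CAGR = (end_value / start_value) ** (1 / years) - 1
# The figures below are invented purely to illustrate the formula,
# not actual Nvidia or market numbers.
def cagr(start_value: float, end_value: float, years: float) -> float:
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical fast-growing segment: 4 -> 16 (arbitrary units) over 4 years
print(f"{cagr(4, 16, 4):.1%}")   # ~41.4% per year

# Hypothetical slow segment: 10 -> 12 over the same 4 years
print(f"{cagr(10, 12, 4):.1%}")  # ~4.7% per year
```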
 
Even if the demand is high I don't see things being as bad as they were during the pandemic coupled with a Crypto boom.
At least this is being planned for, that was multiple crap-storms colliding all at once.

Already have a 4090 dgaf.

[Saturday Night Live 90s GIF]


Had to lol.
 
Whoever did not know this was not paying attention. For one of many examples: if Nvidia is not relying on people buying video cards, they can mark them up as much as they like, and AMD will follow along, with Intel too if they get their shit together.

Yeah, the continued disappearance of sub-$350 NVIDIA GPUs at MSRP from major retailers, a year post-Ethereum, is a pretty clear sign of where the industry is headed!

The move to 8 GB RAM / a 128-bit bus on the 4060/Ti is a step in the right direction, but it didn't prevent the 3050 from being $50-100 over MSRP since launch!

The only thing that dropped those 3050 prices was the 4060, and now the 4060 looks to be the first value card worth buying in years!

 
Even if the demand is high I don't see things being as bad as they were during the pandemic coupled with a Crypto boom.
At least this is being planned for, that was multiple crap-storms colliding all at once.



[Saturday Night Live 90s GIF]


Had to lol.

I've actually expected this to be the next thing for a while, and even if it isn't as bad as the crypto/pandemic shortages, we are already paying more than in prior generations, with pretty much every SKU being moved up a tier in pricing or offering little to no improvement gen on gen.

With the recent rumors of AMD ditching the high end and Nvidia solely caring about AI, it isn't a very pretty picture for gamers.
 
What is the killer end user app for LLMs and how do I use and / or invest in it?
 
I honestly don't see the economic payback there... it would seem, from my noobish experience with AI, that it would be much cheaper and more efficient for someone to rent AWS or Azure rigs for AI than to try to build out a gfx farm...

Also, isn't the value of AI in the training and the application? What is the application of a small AI farm?

It's getting bad already: there are already fake YouTube streamers, and some charge to watch, too.
 
Cryptocurrency was easy: just buy as many GPUs as you can afford, run a miner, and profit. Great for the little guy, but not very enticing for corporations, since they mostly avoid dynamic and volatile markets. With generative tools, and soon AI, it's different; they're here to stay, so corporations will divert manufacturing for their needs, and the little guy can still profit from this. I personally know several people who have already started building small "AI farms", creating fake social media/internet prostitution accounts and, as they call it, "milking simps" with generated photos, chatbots posing as females, and such. As it turns out, it's a much higher ROI than cryptocurrency, because this kind of consumer is surprisingly naive and easy to separate from his money, and there are a lot of them. You can start by using available services, but as demand increases, it becomes much more cost effective to run your own. It's also very much a parallelized workload (since you're serving multiple people at once), so splitting it between several "consumer-grade" machines is not an impediment.
That being said, I'm sure that corporations diverting their attention and manufacturing capacity towards the much more profitable market will be felt by consumers more significantly.
 
That has all existed for a long time. Are people writing their own models and training them, or is it off-the-shelf code from somewhere? That is, once they've got the hardware to run it.

To me this is all the biggest load of bs in a long time. Sure, there are some decent use cases for LLMs. Nothing in it screams some massive game changer, just better chat bots and programming tools for the most part.

There's more utility than blockchain of course, not very difficult!
 
Such things existed, but were much less advanced. Now you can easily generate "good enough" looking photos in seconds and create much more personalized and realistic language models. There is an increasing number of off the shelf solutions which you can easily customize to behave like a single person with a defined personality.

For me one of the best use cases - a game changer so to speak - is something like the "Summarizer" in Brave Search. No visiting random websites with bullshit SEO text to get to the information I need; I just get a straight, short answer.
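
If you want a feel for that kind of summarization locally, here's a minimal sketch using the Hugging Face transformers summarization pipeline; the file name is a placeholder and the model is just whatever default checkpoint the library ships with:

```python
from transformers import pipeline

# Minimal sketch: boil a long, SEO-padded article down to a short answer.
summarizer = pipeline("summarization")

# Placeholder input file containing the article text.
with open("seo_padded_article.txt", encoding="utf-8") as f:
    article = f.read()

result = summarizer(article, max_length=60, min_length=20,
                    do_sample=False, truncation=True)
print(result[0]["summary_text"])
```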
 
For anyone really interested in this topic, watch Jensen's keynote from this week's SIGGRAPH conference.

There's far more to AI than LLMs. In fact, he barely mentioned LLMs during his presentation, which is understandable since it's a computer graphics conference.
 
That’s been happening for centuries… there are far higher stakes for AI if you’re worried about shallow/deep-fakes
 
That’s been happening for centuries… there are far higher stakes for AI if you’re worried about shallow/deep-fakes

Yeah, it was one of many examples; it was just one where I thought I might not get accused of something.
 
I honestly don't see the economic payback there... it would seem, from my noobish experience with AI, that it would be much cheaper and more efficient for someone to rent AWS or Azure rigs for AI than to try to build out a gfx farm...

Also, isn't the value of AI in the training and the application? What is the application of a small AI farm?
Exactly my thoughts watching the video... home-built AI farms... yeah, sure, for what purpose?
 
I've actually expected this to be the next thing for a while, and even if it isn't as bad as the crypto/pandemic shortages, we are already paying more than in prior generations, with pretty much every SKU being moved up a tier in pricing or offering little to no improvement gen on gen.

With the recent rumors of AMD ditching the high end and Nvidia solely caring about AI, it isn't a very pretty picture for gamers.
It's fine: just remove RT and Unreal Engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows, because all improvement today is in fact regression. We already had all of what we get sold as new today. The best games are in history, not on the horizon; this has been the case for years now.
 
It's fine: just remove RT and Unreal Engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows, because all improvement today is in fact regression. We already had all of what we get sold as new today. The best games are in history, not on the horizon; this has been the case for years now.

That's all very subjective, mate. I mean, Endwalker is a relatively new game and is my all-time favorite game, and Dawntrail, the next one, is probably also going to be a lot of fun for me.

And until I can push 150+ FPS in all games at 1440p ultra, ray tracing turned off, I won't be fully content. I prefer the extra smoothness of high refresh even if you don't.
 
That's all very subjective, mate. I mean, Endwalker is a relatively new game and is my all-time favorite game, and Dawntrail, the next one, is probably also going to be a lot of fun for me.

And until I can push 150+ FPS in all games at 1440p ultra, ray tracing turned off, I won't be fully content. I prefer the extra smoothness of high refresh even if you don't.
Preference and necessity are two entirely different things. In the end you play what you can run, right?

Though content-wise, sure, new games still get released, but they don't need newer graphics. Much of the graphical improvement we get now is more of a sidegrade, or it's a GPU trick like DSR and DLSS, with its own traits. It's barely up to the game now; all engines look fine, and it's what devs put in the game in terms of art and design that makes the difference, and that echoes down history. Graphics are the technology, not the content, and a means, not a purpose.
 
Preference and necessity are two entirely different things. In the end you play what you can run, right?

Though content-wise, sure, new games still get released, but they don't need newer graphics. Much of the graphical improvement we get now is more of a sidegrade, or it's a GPU trick like DSR and DLSS, with its own traits. It's barely up to the game now; all engines look fine, and it's what devs put in the game in terms of art and design that makes the difference, and that echoes down history. Graphics are the technology, not the content, and a means, not a purpose.
Where have the times gone when we played Doom 95 Shareware locked at 30 FPS and had tons of fun?
 
It's fine: just remove RT and Unreal Engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows, because all improvement today is in fact regression. We already had all of what we get sold as new today. The best games are in history, not on the horizon; this has been the case for years now.

I mean, I'm still looking forward to games like Gears of War 6 and Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play Witcher 3 NG maxed out at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full RT version of it. Now, if people want to play old games with zero visual improvements, good for them.

Where have the times gone when we played Doom 95 Shareware locked at 30 FPS and had tons of fun?

We have higher expectations nearly 30 years later. Plenty of people game on consoles or the Steam Deck at 30 FPS and are fine with it, though. I never liked it, not even in the 90s.
 
I mean, I'm still looking forward to games like Gears of War 6 and Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play Witcher 3 NG maxed out at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full RT version of it. Now, if people want to play old games with zero visual improvements, good for them.



We have higher expectations nearly 30 years later. Plenty of people game on consoles or the Steam Deck at 30 FPS and are fine with it, though. I never liked it, not even in the 90s.

I'm a high-refresh snob, and many here know that. But 30 FPS capped games look great on the Steam Deck; for some reason, I honestly don't mind capped games on the Deck. Not sure if Linux is smoothing out the frames or what, but it just looks better on the Deck at 30 FPS than at 30 FPS on Windows. Not sure why.

But yeah, I love my Steam Deck for capped games; it complements my high-refresh gaming PC nicely. I imagine a Steam Deck 2 with an OLED screen will improve whatever it is doing even more, so I can't wait for that.
 
I mean, I'm still looking forward to games like Gears of War 6 and Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play Witcher 3 NG maxed out at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full RT version of it. Now, if people want to play old games with zero visual improvements, good for them.



We have higher expectations nearly 30 years later. Plenty of people game on consoles or the Steam Deck at 30 FPS and are fine with it, though. I never liked it, not even in the 90s.
We do, and yet I never really cared much for them. I mean, I enjoyed watching graphics grow from Pong to The Witcher 3, but I never had the feeling graphics make or break a game. Mechanics, content, and world/setting make the game for me. Whether it's Mystic Quest on the SNES or Skyrim with six dozen visual enhancers, I can't say I have more fun in Skyrim. On the contrary, perhaps, even, because the focus is on 'how good can I make it look' rather than 'how awesome is the gameplay'. Mystic Quest is a similar game with a large game world and RPG elements, but it has actual functional combat, while Skyrim, with all its focus on presentation, left us with broken combat, ultra-clunky movement, and the ability to jump up the side of mountains for whatever reason. Yay for 3D...

Cyberpunk in RT: I honestly didn't care for it for one second. I tried another playthrough on the 7900 XT. It's just completely skippable to me, because the content was old. Okay, things are shinier. It became a gimmick the moment I realized it doesn't affect the game, and it really didn't add to immersion either. Immersion happens when the game grabs me, not its graphics, and frankly, Cyberpunk looks plasticky, artificial, and stylized rather than hyperrealistic, and RT almost makes the whole thing a reflective parody of itself.
 
Mechanics, content, and world/setting make the game for me.
Right, for me a game is foremost about how it feels, the atmosphere it gives, the vibes. That has never had anything to do with the technology it uses; it's far more about how well the game was made and how much love the dev put into it, for example.
 