
Would you pay more for hardware with AI capabilities?

  • Yes

    Votes: 2,086 7.3%
  • No

    Votes: 24,247 84.3%
  • Don't know

    Votes: 2,413 8.4%

  • Total voters
    28,746
  • Poll closed.
You might not hate the AI so much if you only had a
 

Attachments

  • WBRNMiJkaYxDkJD3VIlD--1--9gl54.jpg (540.1 KB)
Having taken a look at the results of the poll thus far, 83%+ say no thank you to AI.

You watching @ microsoft, @ apple, @Google?? Hmmm?? Most of us don't care and don't want it on our devices/PCs! Take a hint! DON'T force it on us. We will react poorly and to your detriment.

You might not hate the AI so much if you only had a
Hate is not the right word or description. We just don't care and don't want it on our machines. The people of the world have been fine before it, we'll continue being fine without it.

I'm not saying AI and such don't have usefulness, just that it doesn't belong on our personal devices by default.
 
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing, for datacentre use, and it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us home users. Sure, we've got DLSS, but we've also got FSR, which works without AI. The only reason we have these "useless cores" is that home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas: AI cores take up die area, engineering teams spend time working on AI instead of something else, etc. Without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this or skip AI altogether, which would hurt them a lot more on the datacentre front. So the only choice companies have is to use the "you'll hear about it repeatedly until you end up liking it" approach in their home PC marketing.

Edit: It's strikingly similar to the crypto craze, which was also never meant to be the Regular Joe's bread and butter, but instead, a way for a handful of mining farm operators to get filthy rich.

Edit 2: I also think that DLSS was an afterthought to have something to sell GeForce cards with Tensor cores by - proven by the existence of the 16-series. Nvidia probably had it as a plan B in case AI wouldn't stick.
 
Considering the apparent popularity of certain local AI uses like roleplaying - which started with the early popularity of AIDungeon 2 (does anyone remember that?) and is now done with things like finetuned Llama 3 models - and the number of AI-generated avatars on this very forum*, I'd say it's good to have some options that run locally, as opposed to relegating them to API and server space, or doing something like what Intel did by dummying out AVX512 on client processors (not that GPU makers didn't dummy out FP64, but for entirely unrelated reasons).

Not that it has mattered for the greater majority of users, as you observed - at least until more visible uses of generative AI for home users pop up, and a lot of people are certainly trying to make that happen. Not that they would produce anything groundbreaking or even useful, but the hardware capability has to be there first.

*My own was grabbed off HF when they ran the SD 1.5 demo.
 
Sure, but these weren't anything anyone asked for - nor did they exist during the Turing debut. It's more of a "since we have it, let's use it for something" approach to AI instead of a burning need now fulfilled.
 
Arguably the whole thing actually started with Tesla/GeForce 8, with unified shaders and CUDA, when people started to realize that video hardware can have more uses than pixel-flinging. Cue supercomputers built with thousands of slightly-modified GPUs in the 2010s topping the TOP500.

Again, the capabilities would have to be there first. I think there is always an unspoken and not necessarily burning need for better interactivity in video games, considering the kind of hype once associated with AI and atmospheric behaviour in games from Oblivion to Skyrim, before graphics with HD textures, fancy shaders, and ray-traced effects took over. Current advancements have not yet really percolated into that space, outside of tech demos.
 
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. [...]
You make some compelling points.

Still, I don't like the push from the "big-wigs" to get "AI" on our devices. I don't trust AI or them. Its ultimate usefulness has yet to be proven and we don't need the software side of AI bogging down our devices/systems.
 
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing, for datacentre use [...]
Arguably, it started way before that, when Nvidia realized computers are meant for, well, computing, and started their own compute stack. Tensors were just an evolutionary step; they enabled something that was much more difficult to do before, and here we are now. Hating on AI is like hating AVX512, SSE or the floating point accelerator.
 
Hating on AI is like hating AVX512, SSE or the floating point accelerator.
Except that AVX & SSE are instruction sets within the inner workings of a processor. AI is much more than that. It has the potential for good or evil. Look at it this way - to borrow your analogy, AVX & SSE are tools, like a hammer. Alone they do nothing and are benign. AI is like a human. Give a human a hammer and we can do both good and evil with it. But at the end of the day, a hammer is only ever going to be a hammer. It's the human using it that determines what the outcome will be. It's the same with AI. Once programmed by a human, the result produced by AI and the hardware it runs on will show whether the result is good or bad.
 
At this moment, AI is also just a series of computations (i.e. a tool). It has no more potential for good or evil than your CPU's AVX computations being used to optimize the heating in your home or the trajectory of a ballistic missile.

Publicly available LLMs do not learn on their own. It's why you have GPT 3, 3.5, 4, 4o and so on. They're models validated by humans that will not step outside their approved boundaries. You cannot even query them without applying corrections; they're a looong way from doing anything autonomously.

Yes, there is potential of harm, but which invention/advancement doesn't carry that? And yes, there is fear, but it's instinctual: humans are afraid of anything they don't understand.
 
Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
You cannot prove that something can be trusted to not cause harm. Are you familiar with Asimov's "The Naked Sun"?
 
Still, I don't like the push from the "big-wigs" to get "AI" on our devices. I don't trust AI or them. Its ultimate usefulness has yet to be proven and we don't need the software side of AI bogging down our devices/systems.
Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?

As with all unknowns, our choice with AI is to fear it or learn it. The second one seems wiser to me, considering that all three chip designers are heavily invested in it, so AI is going to stay whether we like it or not. Based on what I've gathered so far, "AI" is nothing more than a fancy name for a new type of processing core beside our usual INT and FP units, made for matrix calculations. It is not intelligent, self-aware or self-sufficient, and it has no sense of morality or anything else whatsoever. It computes whatever we, humans, want to compute, just like any other part of your PC. Without software, it sits unused.
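
To make the "matrix calculations" bit concrete, here's a minimal sketch (plain NumPy, hypothetical numbers, no special hardware assumed) of the multiply-accumulate work a single neural-network layer boils down to - the kind of arithmetic Tensor cores and NPUs exist to speed up:

```python
import numpy as np

# Toy fully-connected layer: y = relu(x @ W + b).
# Dedicated "AI" units (Tensor cores, NPUs) exist to do exactly this kind of
# multiply-accumulate work quickly and at low precision; nothing in the
# hardware is "intelligent" by itself.
rng = np.random.default_rng(0)

x = rng.standard_normal((1, 8), dtype=np.float32)   # one input with 8 features
W = rng.standard_normal((8, 4), dtype=np.float32)   # learned weights (random placeholders here)
b = np.zeros(4, dtype=np.float32)                    # learned biases

y = np.maximum(x @ W + b, 0.0)                       # matrix multiply, add, ReLU
print(y)
```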

Tools can be trusted to do no harm only as much as we, humans, can be trusted not to use our tools to do harm.

Unfortunately (or not?), we live in an age when technological advancement is multitudes faster than the rate at which humanity's readiness for it is growing. Where it gets us, we'll see. Ride or die, I guess. :ohwell:
 
There is a push, but it's coming from marketing more than anything else. Any program using the crappiest of models, or even something slightly more involved than if/then/else, gets an AI sticker these days. For example, what I'm working on right now got an "AI assistant" because we threw a bunch of generic, domain-oriented documents at a model and it can answer a few very basic questions now. If you don't do that, investors will think you're lagging and invest their money elsewhere.
And once again: really nothing for the end user to worry about.
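
For what it's worth, the "throw documents at a model" assistant described above is usually just retrieval bolted onto an off-the-shelf model. A rough, hypothetical sketch of the idea (toy keyword-overlap retrieval instead of real embeddings, and a placeholder model call, since the actual stack isn't specified):

```python
import math
from collections import Counter

# Stand-ins for the "generic, domain-oriented documents" mentioned above.
documents = [
    "The warranty covers hardware defects for two years.",
    "Support tickets are answered within one business day.",
    "Firmware updates are published on the vendor portal.",
]

def bag_of_words(text: str) -> Counter:
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> float:
    # Cosine similarity over word counts; real systems use embedding models instead.
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def call_llm(prompt: str) -> str:
    # Placeholder: a real product would send the prompt to a hosted or local model here.
    return f"[model reply based on: {prompt[:60]}...]"

def answer(question: str) -> str:
    q = bag_of_words(question)
    best = max(documents, key=lambda d: similarity(q, bag_of_words(d)))  # retrieve
    prompt = f"Answer using only this context:\n{best}\n\nQuestion: {question}"
    return call_llm(prompt)                                               # generate

print(answer("How long is the warranty?"))
```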
 
For programs that are actually on the frontier of local AI performance, and actually on the power-user end of the consumer space, Ollama can be good. SOTA models of 70B+ size and their finetunes can produce a fun and mostly self-consistent story on prompt, game-master better RPs than AIDungeon 2 or 3 ever did back in the day, and hold their end of a discussion like this reasonably well, generally making sense with few glaring errors and hallucinations, while sometimes offering specific insights beyond the obvious. You can run those models reasonably well with 64GB of RAM or more, if you can tolerate the slow inference.

Notably, that software and its backend currently have no support for dedicated NPUs, and are already memory bound on most current systems, as far as I'm aware.
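
For anyone curious, here's a minimal sketch of driving a local Ollama instance from Python over its REST API. It assumes Ollama is installed and a model such as llama3:70b has already been pulled; on an ordinary desktop a 70B-class model runs from system RAM and will be slow, as noted above:

```python
import json
import urllib.request

# Ollama serves a local REST API on http://localhost:11434 by default.
# Assumes `ollama pull llama3:70b` (or a smaller model) has been run beforehand.
payload = {
    "model": "llama3:70b",   # swap in a smaller model if 64GB+ of RAM isn't available
    "prompt": "Game-master the opening scene of a short fantasy adventure.",
    "stream": False,         # return a single JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Inference from system RAM is memory-bandwidth bound, so expect long waits
# with 70B-class models on typical desktops.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```
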
Unfortunately (or not?), we live in an age when technological advancement is multitudes faster than the rate at which humanity's readiness for it is growing. Where it gets us, we'll see. Ride or die, I guess. :ohwell:
That has been going on for a while now with tech products. In the '90s it was naughties and FPS; slightly later it was - and still is - social networks and associated ills, and things like phone use where it shouldn't be; now it's this. Never before has it been regarded as an existential risk by a whole lot of otherwise reasonable people, though.

Problems with abuses of lesser generative AIs have already become evident, even when these tools don't actually have a will of their own.

As to tools not causing harm, observe the reception of Stable Diffusion 3, and how tools designed not to cause potential harm can also become tools not (as) useful for their designed purpose, even though that specific example was more or less inevitable.
 
First I've heard of AI Dungeon - sounds neat. I'm going to take a peek at it just out of curiosity.

You decide to test the blade's power on this unsuspecting creature. With a swift motion, you swing the weapon, feeling the blood-red jewel's energy coursing through you. The blade sings through the air, leaving a trail of shimmering energy in its wake. The creature, sensing danger, turns to face you, its eyes wide with fear. But it's too late. The blade connects with a sickening thud, and the creature collapses, lifeless. You stand there, panting slightly, taking in the aftermath of your first kill. The creature lies still at your feet, its lifeless body a testament to the power of the relic.

I am a bad man XD
 
You cannot prove that something can be trusted to not cause harm. Are you familiar with Asimov's "The Naked Sun"?
Testing and open disclosure easily go a long way toward demonstrating it.

Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?
Apple has the "Apple Intelligence". Google is working on something for Android and ChromeOS.
 
The GPU is designed for running HLSL shaders, which it is good at. Some weather applications can leverage the GPU to render climate models, but the scope is limited.
 
voted "dont know"

At some point when its use is compellingly useful... yeah why not.

If AI automation wipes my ass after a dump without leaving the gaming chair, i'm IN!
 
Hell no.
AI is a buzzword.

Not that I've looked into it much, but is there an open standard that works? Also, since I dislike big Green, I'm against anything they're pushing, as well as any closed standards.
 
I have no use for AI or any interest in it for now, so no.
It's part of that hive mind/Skynet BS - no thanks.
 
By the time "AI" becomes a thing for the local user, all new hardware on the market will already have an NPU. so whoever builds a system with new components will have no choice but to buy "AI" hardware. So, this poll wouldn't make sense for that future.
 
Nope. GPT-4 is still not good enough; maybe after 2 more iterations it will reach a usable level. At that point, I can pay extra to run it locally.
 