
Do you use your GPU for Compute?

Besides Gaming, what do you use GPU Compute for?

  • AI

    Votes: 2,504 9.9%
  • Encoding

    Votes: 3,134 12.4%
  • Rendering

    Votes: 3,008 11.9%
  • Mining Crypto

    Votes: 791 3.1%
  • Folding/BOINC

    Votes: 762 3.0%
  • No compute, just gaming

    Votes: 14,986 59.5%

  • Total voters
    25,185
  • Poll closed.

W1zzard

Administrator
Staff member
Joined May 14, 2004
Messages 28,755 (3.75/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
With all the hype around AI, we are wondering: besides gaming, what do you use your GPU for?

If multiple apply, pick the most important one, and maybe let us know in the comments.
 
Useful for 3D modelling. I don't do 3D myself, but I have a buddy who doesn't have a suitable PC, so I render stuff for him once every month or two. Does that count?
Also occasionally crack my own passwords with that thing just for lulz.
 
Hi,
I use the gpu for compute-r does that count :p
 
Multi-selection on the poll would be preferable.
I voted rendering, which is the closest to processing engineering data in software that can be accelerated using CUDA cores.
But I also use my GPU in DaVinci Resolve, and for gaming of course.
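For the curious, this is roughly what offloading that kind of number crunching to CUDA cores looks like from Python. It's only a hedged sketch: it assumes CuPy is installed and an NVIDIA GPU is present, and the FFT/low-pass workload and array size are made-up placeholders rather than anything from the actual engineering software mentioned above.

```python
# Minimal sketch of offloading numeric work to CUDA cores with CuPy.
# Assumes an NVIDIA GPU and a matching CuPy wheel (e.g. `pip install cupy-cuda12x`);
# the signal size and the FFT/filter workload are made-up placeholders.
import numpy as np
import cupy as cp

signal = np.random.rand(1 << 22).astype(np.float32)  # stand-in "engineering data"

gpu_signal = cp.asarray(signal)          # copy host -> device
spectrum = cp.fft.rfft(gpu_signal)       # FFT runs on the GPU
spectrum[10000:] = 0                     # crude low-pass filter
filtered = cp.fft.irfft(spectrum)        # inverse FFT, still on the GPU

result = cp.asnumpy(filtered)            # copy device -> host
print(result[:5])
```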
 
I run Folding@Home on a couple of GPUs. I also mined ETH on the 6800 XT (back when I could) to offset its purchase price.
 
I run Folding@home and dabble with Stable Diffusion.
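For anyone wondering what "dabbling with Stable Diffusion" involves code-wise, here is a minimal sketch using the Hugging Face diffusers library. It assumes a CUDA GPU with enough VRAM; the checkpoint name and prompt are just illustrative examples, not anything specific to this post.

```python
# Minimal Stable Diffusion sketch with Hugging Face diffusers.
# Assumes `pip install diffusers transformers torch` and a CUDA GPU;
# the checkpoint and prompt are placeholders, not anything from this thread.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")                 # run the UNet/VAE on the GPU

image = pipe("a graphics card rendering itself, digital art").images[0]
image.save("output.png")
```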
 
This poll should be multiple choice.

I'm probably doing the first three in late 2023. I don't mine crypto (who does on general-purpose silicon anymore?), nor have I done any of the folding stuff in the past twenty years.

It also doesn't accommodate people with multiple systems. My primary gaming build is really just for that. My secondary gaming build is also used for video upscaling, currently with waifu2x, probably with something else in the near future. I've dabbled a couple of times with the Topaz software.

Even when I do a waifu2x upscale, there are clearly periods when the CPU is doing most of the work and other times when both the CPU and GPU are being heavily used. However, I don't really know what silicon the GPU is using at any given time, whether it's GPU cores, RT cores, ML cores, media encoding cores, or just a lot of VRAM I/O.
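One rough way to answer that question is to poll the GPU's utilization counters while the upscale runs. Below is a hedged sketch using the pynvml bindings, assuming an NVIDIA card and `pip install pynvml`; it can't tell shader and tensor cores apart, but it does report compute, encoder, and decoder activity separately.

```python
# Rough sketch: poll NVIDIA utilization counters while an upscale/encode runs.
# Assumes an NVIDIA GPU and `pip install pynvml`; it cannot distinguish shader
# from tensor cores, but it separates compute, encoder, and decoder load.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(30):                  # ~30 seconds of samples
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        enc, _ = pynvml.nvmlDeviceGetEncoderUtilization(handle)
        dec, _ = pynvml.nvmlDeviceGetDecoderUtilization(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"gpu {util.gpu:3d}%  vram-bus {util.memory:3d}%  "
              f"enc {enc:3d}%  dec {dec:3d}%  "
              f"vram {mem.used / 2**30:.1f}/{mem.total / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```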

For my daily-driver Mac, I'm sure the GPU is used for multiple tasks, including ML as well as hardware media encoding and decoding. I'm also using the Upscayl AI image upscaling tool. I don't really game on the Mac since I have a better-suited Windows PC for that (RTX 3080 Ti).

I've also used Nvidia Broadcast and Nvidia Canvas on a couple of my lesser RTX builds. The former is real-time video processing (like background replacement); the latter is just a silly little AI paint program.

I know Apple does some image/video processing on the GPU (Neural Engine cores or GPU cores, probably both) on the Mac, just like on my iPhone and iPad. I'm pretty sure DaVinci Resolve also takes advantage of the differentiated silicon. Same with Pixelmator Pro.

And with each major macOS and iOS/iPadOS release, there are more functions that harness the GPU and ML cores. When I upgrade to the latest macOS and iOS/iPadOS in spring 2024, my GPUs will be doing even more work.

Hell, even web browsers will use hardware acceleration when available. This includes video playback. I know RTX Video Upscaling is an available setting in the Nvidia Control Panel starting with the Ampere generation. And hardware video decoders have been included in CPUs and GPUs for years. So even if you're watching a YouTube video, your machine is probably using differentiated GPU silicon to decode the stream (H.264, HEVC, etc.).
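Outside the browser, a quick way to see which hardware decode paths a machine exposes is to ask ffmpeg. This is a small sketch assuming ffmpeg is installed and on the PATH; browsers make their own decoder choices, so this only shows what the system could offer.

```python
# Quick check of which hardware acceleration methods ffmpeg sees on this box.
# Assumes ffmpeg is installed and on the PATH; browsers pick decoders on their
# own, so this only lists what the system could offer (cuda, qsv, d3d11va, ...).
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
```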

At least one of the video conferencing systems uses ML processing to fake proper eye contact. I know I've tried this out but I don't remember which system I used. My Mac? A Windows PC?

There's also some sort of text recognition feature when analyzing photos. My guess is this is also being offloaded to available GPU/ML cores in 2023 whether it's my phone or my computers.

Really, the main moment I'm conscious of this is when I'm using Handbrake. If the requisite hardware exists, there will be hardware encoding options (Intel, AMD, Nvidia, or VideoToolbox, which is Apple's). Most of the time I don't really know what calculations are being done on which transistors.
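As a rough illustration of those Handbrake options, here's a sketch of driving a hardware encoder from the command-line build, HandBrakeCLI, via Python. The file names are placeholders, and the exact encoder identifiers (nvenc_*, qsv_*, vce_*, vt_*) can vary by HandBrake version, so treat them as assumptions and check `HandBrakeCLI --help` on your build.

```python
# Sketch of driving HandBrake's hardware encoders from a script.
# Assumes HandBrakeCLI is installed and on the PATH; file names are
# placeholders, and encoder identifiers may differ by HandBrake version.
import subprocess

source = "input.mkv"       # placeholder input
dest = "output.mp4"        # placeholder output
encoder = "nvenc_h265"     # e.g. nvenc_*, qsv_*, vce_*, vt_* depending on hardware

subprocess.run(
    ["HandBrakeCLI", "-i", source, "-o", dest, "-e", encoder, "-q", "28"],
    check=True,
)
```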

When I fire up something like Photos or iMovie on my Mac, iPhone, or iPad, different tasks are using different transistors. As long as the task is executed successfully, I don't really care if they're CPU cores, GPU cores, ML cores, RT cores, media engine cores, whatever. It's up to the software developer to harness whatever resources are available to make it work. I think Apple first put Neural Engine cores in the iPhone 8/X (A11) generation. And the Neural Engine cores are being used far more in 2023 than they were back then.

And some day, someone will add another type of differentiated transistor into a system. And I'll probably use that as well without knowing.

I'm pretty sure there are 6-8+ other tasks harnessing GPU/ML cores that I'm not even aware of. And there will be more in 2024, whether it's something built into an operating system API or a standalone feature in a specific application.

Unless you only game on your system, it is highly unlikely in late 2023 that your GPU is only being used during gaming. If you do anything else on your system, you are probably using the GPU for non-gaming tasks. You just don't realize that software developers have moved some functionality to these transistors while you were playing your game.
 
I do just about all of them apart from mining, don't know which to pick with just one choice, bud.
 
The crunching response rate makes me sad, though it's at least higher (as of this post) than crypto's...
 
I like to take clips of my best gaming moments. I use Action! recorder (Steam) with NVENC as a full replacement for GeForce Experience and ShadowPlay pretty much all the time. Ada's NVENC is such a massive leap over Turing/Ampere's, it's amazing. Other than that, I do GPU transcoding of said videos sometimes.
 
The crunching response rate makes me sad, though it's at least higher (as of this post) than crypto's...
Crunching doesn't really make much sense these days. If you want to contribute to these efforts, donate cash, fully appreciated securities, etc.

At least you can claim part of the donation as a tax deduction. The electricity you burn while crunching cannot be written off. Neither can the hardware. You're better off buying shares of SPY and QQQ, holding them a year, then donating them.
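To make the appreciated-shares argument concrete, here's a back-of-the-envelope comparison. Every number in it is an illustrative assumption (that you itemize, a 24% marginal income tax rate, 15% long-term capital gains), not tax advice.

```python
# Back-of-the-envelope comparison of donating cash vs. appreciated shares.
# All figures are illustrative assumptions (you itemize, 24% marginal income
# tax, 15% long-term capital gains), not tax advice.
donation = 1_000          # fair market value donated either way
cost_basis = 400          # what the shares originally cost
income_tax_rate = 0.24
cap_gains_rate = 0.15

# Cash: you deduct the donation, nothing else changes.
cash_net_cost = donation - donation * income_tax_rate

# Appreciated shares held >1 year: same deduction, plus you never pay
# capital gains tax on the appreciation you gave away.
gains_avoided = (donation - cost_basis) * cap_gains_rate
shares_net_cost = donation - donation * income_tax_rate - gains_avoided

print(f"net cost of $1,000 donated in cash:   ${cash_net_cost:,.0f}")
print(f"net cost of $1,000 in donated shares: ${shares_net_cost:,.0f}")
```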

And these large organizations can get electricity rates cheaper than retail residential rates (which are generally the highest of all utility customers). And yes, they can acquire computing hardware at lower cost and put it in facilities at better rates than Joe Consumer.

If you really feel like you need to burn up some electrons to participate, find the lowest-power, most energy-efficient device you have and use that one device to crunch your work units.

Disclaimer: I crunched my last SETI@home work unit probably around 1999 or 2000. It was on a Linux box, that much I remember clearly.
 
I voted AI since I use Stable Diffusion, but I also use it for rendering in Blender and transcoding in Handbrake.

I've had to go back to driver 23.7.2 because later drivers seem to be broken for GPU rendering in Blender. At least with that older driver, GPU rendering works fine in Blender 3.6.7.

Blender 4.0.2 seems to have broken all sorts of stuff on AMD.
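For reference, this is roughly how GPU rendering gets enabled from Blender's Python API rather than the preferences UI. It's a hedged sketch meant to run inside Blender (for example via `blender -b scene.blend --python enable_gpu.py -f 1`); "HIP" matches the AMD card discussed here, while NVIDIA users would pick "CUDA" or "OPTIX".

```python
# Sketch of enabling Cycles GPU rendering via Blender's Python API (bpy).
# Meant to run inside Blender (e.g. `blender -b scene.blend --python this.py -f 1`).
# "HIP" matches the AMD card discussed above; NVIDIA users would use
# "CUDA" or "OPTIX" instead.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"        # or "CUDA" / "OPTIX" / "ONEAPI"
prefs.get_devices()                      # refresh the detected device list

for device in prefs.devices:
    device.use = True                    # enable every detected device

bpy.context.scene.cycles.device = "GPU"  # tell Cycles to render on the GPU
print("Cycles devices:", [(d.name, d.use) for d in prefs.devices])
```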
 
Had been using it for Handbrake and "backing up" Blu-rays, but I've switched over to using Intel Quick Sync now (I guess that's still a GPU).
 
Crunching doesn't really make much sense these days. If you want to contribute to these efforts, donate cash, fully appreciated securities, etc.

At least you can claim part of the donation as a tax deduction. The electricity you burn while crunching cannot be written off. Neither can the hardware. You're better off buying shares of SPY and QQQ, holding them a year, then donating them.

And these large organizations can get electricity rates cheaper than retail residential rates (which are generally the highest of all utility customers). And yes, they can acquire computing hardware at lower cost and put it in facilities at better rates than Joe Consumer.

If you really feel like you need to burn up some electrons to participate, find the lowest-power, most energy-efficient device you have and use that one device to crunch your work units.

Disclaimer: I crunched my last SETI@home work unit probably around 1999 or 2000. It was on a Linux box, that much I remember clearly.

But none of that keeps my office warm in the winter. Also, BOINC supports small projects that don't necessarily have institutional support and resources. Getting inefficient donated compute time from randos on the Internet is still cheaper than buying it from a cluster. Someone made a similar argument in another thread, and I can't help but wonder: if distributed computing is that decisively disadvantageous vs. cluster computing, why is it still going on?
 
Someone made a similar argument in another thread, and I can't help but wonder: if distributed computing is that decisively disadvantageous vs. cluster computing, why is it still going on?

The primary value of distributed computing projects is public awareness, not the crunching itself. Ultimately, the goal of awareness is community involvement.

However, in most cases, opening up your checkbook is actually a more useful way to get involved than crunching work units, since cash can be used in many different ways. As we have touched on, buying power at the institutional level is far better than at the individual consumer level.

For smaller and newer projects, I think there's a case for distributed computing, simply to get the issue in front of a segment of the population that might otherwise be unaware of it.

There's a cost and overhead associated with all of these distributed computing programs. They aren't free. Could the money used to collect data, hire staff, and distribute, process, and analyze work units be used more strategically for other things? At the end of the day, these are all fundamentally marketing projects to increase awareness.

It's up to each organization to review its distributed computing project periodically and evaluate whether or not it still fits the overall mission. SETI@home, for one, has ended public participation. There's no single approach that applies to every organization at every moment.

But without a doubt, these distributed computing projects are a tiny fraction of the overall budget of the larger organization behind the effort. In the end, a monetary donation (or fully appreciated equities) goes further because the funds can be allocated to a far greater number of use cases.

Again, it's hard to ignore the tax implications from a consumer standpoint. If I give American Cancer Society one dollar in cash, I can claim some of that as a deduction. If I give American Cancer Society one dollar in electricity at residential rates (including hardware depreciation), I can't claim any of that.

Your tax advisor isn't going to ask you how many work units you crunched. But they will ask you if you made any charitable contributions. The federal and state governments recognize those contributions.

Maybe you get a lot of satisfaction from seeing your BOINC work unit counter increment. Pat yourself on the back and give yourself an "attaboy". If you like that more than seeing a $100 check clear your checking account, stick with the distributed computing. Or do both if you like.
 
multiselection where
Unfortunately, the poll engine doesn't allow multi-select (I wrote it, so it's purely my fault).

Vote for the one that you use the most, and maybe leave a comment.
 
Unfortunately, the poll engine doesn't allow multi-select (I wrote it, so it's purely my fault).

Vote for the one that you use the most, and maybe leave a comment.
Hi,
I don't see a fault, actually.
Not surprising gaming is the main reason someone would buy or need a GPU. Anyone doing anything else would select a different option.

A secondary option would just cloud the poll's results, making the poll pretty much useless.
 
Not surprising gaming is the main reason
I didn't ask "main reason = gaming", I asked "what's your main compute use?" One possible answer is "no compute, just gaming". Should I reword?
 
Hi,
I was referring to the results so far
[screenshot of the poll results so far]
 
Just encoding, I don't care about AI things or mining crypto.
Maybe rendering sometimes, but not in heavy use.

 
I answered rendering, but it does sometimes get used for encoding, and in the winter I'll put it on F@H whenever it's not doing that. Even with all that, though, I'd say it's 80% gaming in the end.
 