
Do you use your GPU for Compute?

Besides Gaming, what do you use GPU Compute for?

  • AI: 2,504 votes (9.9%)
  • Encoding: 3,134 votes (12.4%)
  • Rendering: 3,008 votes (11.9%)
  • Mining Crypto: 791 votes (3.1%)
  • Folding/BOINC: 762 votes (3.0%)
  • No compute, just gaming: 14,986 votes (59.5%)

Total voters: 25,185. Poll closed.
Just gaming. When I had my previous card (1080 Ti), I did some testing with AI image generation via CUDA, but I just couldn't get Stable Diffusion to work with AMD.
 
For Stable Diffusion you need a Quadro. Even the RTX 4090 has its limitations, and it's not recommended to use a GeForce for intensive work.
 
Sometimes I use AVC-Free for basic video editing. My craptop has, in addition to its Radeon Vega 8 IGP, a discrete GeForce GTX 1050M. It does a decent job of editing with AVC-Free as well. It's nowhere close to as fast as any desktop GPU but it's fast enough to be usable. :D

Of course, my RX 7900 XTX is wickedly fast but I only do basic video editing (mostly just cutting humourous clips out of videos or re-encoding video clips that are corrupt) once in a blue moon anyway.

For 99.99999% of the time, I'm strictly a gamer.

I run Folding@Home on a couple of GPUs. I also mined ETH on the 6800 XT (back when I could) to offset its purchase price.
Yeah, I did the same thing with my RX 6800 XT. I threw it into a makeshift mining rig with my RX 5700 XT. It did pretty well.

Just gaming. When I had my previous card (1080 Ti), I did some testing with AI image generation via CUDA, but I just couldn't get Stable Diffusion to work with AMD.
The great thing about the world is that, sooner or later, someone will figure out how to do exactly that. Check these out:
I'm actually interested to try it now.
 
I would like to buy AMD video cards, but I only buy Nvidia because Nvidia cards encode video with better quality.
For me, encoding quality is the deciding factor.
Both AMD and Nvidia offer good gaming performance, but when it comes to video encoding, Nvidia's GPUs produce the superior output.
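(For what it's worth, one rough way to check that claim on your own hardware is to encode the same clip with each vendor's hardware encoder at the same bitrate and score the results with VMAF. The sketch below is only an illustration, assuming an ffmpeg build with h264_nvenc, h264_amf, and libvmaf enabled; the file names and bitrate are placeholders.)

```python
# Rough encoder-quality comparison sketch (assumes ffmpeg with h264_nvenc,
# h264_amf and libvmaf support on PATH; clip.mp4 and the bitrate are made up).
import subprocess

SOURCE = "clip.mp4"   # hypothetical test clip
BITRATE = "6M"

def encode(encoder: str, out: str) -> None:
    # Encode the source clip with the given hardware encoder at a fixed bitrate.
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", encoder, "-b:v", BITRATE,
        "-an", out,
    ], check=True)

def vmaf(distorted: str) -> None:
    # Print a VMAF summary of the encode against the original clip.
    subprocess.run([
        "ffmpeg", "-i", distorted, "-i", SOURCE,
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)

encode("h264_nvenc", "nvenc.mp4")  # Nvidia NVENC
encode("h264_amf", "amf.mp4")      # AMD AMF/VCN (availability depends on the build)
for f in ("nvenc.mp4", "amf.mp4"):
    vmaf(f)
```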

----------------------------

Do you know if Nvidia video cards still have that problem with low image quality when outputting over HDMI? I remember that Radeons had perfect image quality through the HDMI output, but GeForces had a bad image through HDMI (they only had a good image through the DP outputs).
 
Do you know if Nvidia video cards still have that problem with low image quality when outputting over HDMI? I remember that Radeons had perfect image quality through the HDMI output, but GeForces had a bad image through HDMI (they only had a good image through the DP outputs).
Never experienced that issue and my newest (and last) Nvidia card was a 1080 Ti from 2017. Perfect picture from DVI, HDMI and DP :confused:
 
Do you know if Nvidia video cards still have that problem with low image quality when outputting over HDMI? I remember that Radeons had perfect image quality through the HDMI output, but GeForces had a bad image through HDMI (they only had a good image through the DP outputs).

This past week I was swapping out Nvidia and AMD GPUs, connecting to my monitor through HDMI (as my main PC was monopolizing the DP input), and I saw no difference in HDMI video quality on a 5600 XT, 2060 Super, 6600 XT, and 1080.
 
Do you know if Nvidia video cards still have that problem with low image quality when outputting over HDMI? I remember that Radeons had perfect image quality through the HDMI output, but GeForces had a bad image through HDMI (they only had a good image through the DP outputs).
That depends on your display, too. No GPU has any problem driving my monitor, but my 4K TV has much better colours with Intel or AMD GPUs than with Nvidia. 4K 60 Hz with 8-bit colours and full dynamic range is not an option with Nvidia, but it is with AMD and Intel for some reason. That's why I connect the TV to the 11700's iGPU, even though the PC also has a 2070 in it.

With that said, the 2070 is the newest Nvidia card that I have. I don't know if this is still an issue with newer cards.
 
I do some basic 3D rendering here and there when I'm bored or want to test something specific, and the odd time I do mean to make something a bit more serious (it's a very odd time when I do). Other than that, it's mostly gaming, because none of my other workloads benefit from GPU compute.
 
Watching videos, doing CAD, looking at photos, etc.
 
I picked AI because I generate images with Stable Diffusion. I also occasionally stream games from my desktop PC to my living room TV with Parsec, which encodes with H.265.

Both work exceptionally well on my 6700 XT, despite the common narrative about AMD when it comes to AI and encoding. Until relatively recently I actually had an advantage over 3070 owners with Stable Diffusion because I have more VRAM.
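For anyone curious what that looks like in practice, here's a minimal sketch, assuming a ROCm (or CUDA) build of PyTorch plus the Hugging Face diffusers package; the checkpoint name and prompt are just examples, not a recommendation:

```python
# Minimal Stable Diffusion sketch. On AMD this assumes a ROCm build of PyTorch,
# where the GPU is still exposed through the torch.cuda API.
import torch
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example SD 1.5 checkpoint; any compatible model works
    torch_dtype=torch.float16,          # halves VRAM use; drop if fp16 misbehaves on your card
).to(device)

image = pipe("a watercolor painting of a graphics card on a workbench").images[0]
image.save("out.png")
```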
 
Photo stacking and other image-processing apps use my GPU, while gaming is more of a secondary use for me.
 
I was learning Android app development, and after that AI, but gave up at about the halfway point of the former. Now it's just gaming and madVR. I do use my GPU for anything that permits it from time to time, be it photo editing, transcoding, etc.
 
The great thing about the world is that, sooner or later, someone will figure out how to do exactly that. Check these out:
I'm actually interested to try it now.
It's cool, but getting AI to work on AMD is just painful.
AMD should know better and lend some help to the developers who are making these programs.
 
It's cool, but getting AI to work on AMD is just painful.
AMD should know better and lend some help to the developers who are making these programs.
I completely agree. There's no reason why it should be so hard to do AI on a Radeon.
 
What about personal projects?

I use my GPU for personal experiments and personal projects. As it turns out, having 5TFlops+ of compute is crazy good for all kinds of problems.
 
What about personal projects?

I use my GPU for personal experiments and personal projects. As it turns out, having 5TFlops+ of compute is crazy good for all kinds of problems.
Nah. My personal projects are deciding which game to buy next. I'm kinda disappointed with Elden Ring. I expected so much more from the 2022 "Game of the Year" than for it to amount to just another Dark Souls sequel. :laugh:
 
Nah. My personal projects are deciding which game to buy next. I'm kinda disappointed with Elden Ring. I expected so much more from the 2022 "Game of the Year" than for it to amount to just another Dark Souls sequel. :laugh:

Well, I get that people aren't into random-number generators or other theoretical comp-sci stuff. But I did do things like exhaustively check all 32-bit integers on a GPU (a Radeon Vega 64) with custom code of mine. It turns out that 4096 shaders running at 1.5 GHz can execute something like 6.144 trillion instructions per second, and the 32-bit integer space is only about 4 billion integers.

I.e., I could exhaustively check all 32-bit integers for the "most random constant" of my RNG in something like one second of GPU compute, whereas it would have taken maybe 30+ minutes on my CPU.

That's maybe not "crazy cool" or whatever, but it's something that took me about two or three hours to code up in ROCm and then a second or two to run. Meanwhile, CPU code would have been much harder to run and debug when it's 30 minutes each time to test the code.

---------

In the right environment, GPU coding is actually very easy and possibly more convenient than CPU programming, especially once runtimes creep into the 1-minute to 10+ minute range. Running code in one second instead of ten minutes speeds up development.
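Not the poster's actual code, but a minimal sketch of the same idea, assuming a PyTorch build that can see the GPU (ROCm builds also expose it through torch.cuda): score every odd 32-bit multiplier with a crude avalanche test and keep the best one. The toy mixer, sample count, and chunk size are made up for illustration.

```python
# Sketch of an exhaustive scan over 32-bit constants on the GPU.
# The mixer and the avalanche score are toy stand-ins, not the poster's RNG.
import torch

MASK32 = (1 << 32) - 1
device = "cuda" if torch.cuda.is_available() else "cpu"

def mul32_lo(a, b):
    """Low 32 bits of a*b, computed in int64 without overflow."""
    lo = (a * (b & 0xFFFF)) & MASK32
    hi = (a * ((b >> 16) & 0xFFFF)) & 0xFFFF
    return (lo + (hi << 16)) & MASK32

def mix(x, c):
    """Toy xorshift-multiply step using candidate constant c."""
    x = mul32_lo(x, c)
    return x ^ (x >> 16)

def avalanche(consts, samples):
    """Mean number of output bits that flip when input bit 0 flips.
    consts: (N,) int64, samples: (S,) int64 -> (N,) float."""
    a = mix(samples.unsqueeze(0), consts.unsqueeze(1))        # (N, S)
    b = mix((samples ^ 1).unsqueeze(0), consts.unsqueeze(1))  # (N, S)
    diff = a ^ b
    bits = torch.zeros_like(diff)
    for i in range(32):                                       # popcount
        bits += (diff >> i) & 1
    return bits.float().mean(dim=1)

samples = torch.randint(0, 1 << 32, (128,), dtype=torch.int64, device=device)
CHUNK = 1 << 16                      # constants per batch; tune for VRAM
best_err, best_c = float("inf"), None

# The full scan covers 2^31 odd constants; shrink the range for a quick test.
for start in range(1, 1 << 32, 2 * CHUNK):
    consts = torch.arange(start, min(start + 2 * CHUNK, 1 << 32), 2,
                          dtype=torch.int64, device=device)
    err = (avalanche(consts, samples) - 16.0).abs()   # ideal: half the 32 bits flip
    i = int(torch.argmin(err))
    if float(err[i]) < best_err:
        best_err, best_c = float(err[i]), int(consts[i])

print(f"best constant: {best_c:#010x} (avalanche error {best_err:.3f})")
```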
 
What about personal projects?

I use my GPU for personal experiments and personal projects. As it turns out, having 5TFlops+ of compute is crazy good for all kinds of problems.
It depends on the project, I guess. Most compute libraries worth anything are CUDA-only, so not much choice there.
If you're just dabbling with compute for the first time, obviously you're going to use whatever GPU you already have. Even if that means it's harder to set up and it won't run as fast, if you have an AMD card, that's what you're going to use. Beyond that, you're probably going to want to go Nvidia.
 
It depends on the project, I guess. Most compute libraries worth anything are CUDA-only, so not much choice there.
If you're just dabbling with compute for the first time, obviously you're going to use whatever GPU you already have. Even if that means it's harder to set up and it won't run as fast, if you have an AMD card, that's what you're going to use. Beyond that, you're probably going to want to go Nvidia.

Honestly, I've looked through Thrust, CUB, TensorFlow, etc.

CUB is the only useful library from Nvidia for the experiments I listed earlier, and ROCm's hipCUB is an adequate replacement.

--------

I find it disheartening that so few people here are experimenting with the fundamentals of SIMD compute on GPUs. Well... somewhat disheartening. On the other hand, it makes it clear that I've found a niche that few other programmers understand, so there's that. Here's to job security, I guess?

Not that any of my HPC experiments are actually part of anything professional. But I'm a lot more confident that I can interview and talk about low-level, foundational parallelism issues in a way that few others are exploring these days. Don't get me wrong, I understand that TensorFlow and other such libraries are easier to use and get going with. But there's so, so much more that these GPUs could be doing.
 
A 60% gaming majority. Take note, Nvidia, and stop taxing us with those high premiums.
 
A 60% gaming majority. Take note, Nvidia, and stop taxing us with those high premiums.
It's not Nvidia's fault, but the fault of the people who buy the cards at these prices.

As a company, they just try to make as much money as possible; it's their purpose and their duty as a public company.

Buy from AMD/Intel, buy second-hand, or buy an APU.
 
Buy from AMD/Intel, buy second-hand, or buy an APU.

Perhaps drop the APU, scrap gaming altogether, and play ping pong against the wall in single-player mode. Zero power consumption + unlimited FPS in real time, lol. Actually, my last two buys were both USED... at much lower prices, acquired arduously looooong after launch, but that can't be a praiseworthy norm going forward.

It's true that companies aim to maximize profits (good on them), even if that means stirring the pot by miserably falling short of the balance between profitability and ethical business practices. I have no problem with companies aiming for the stars, but as consumers, from an ethical POV, we reserve the right to share our sentiments and discontent, or simply to point out that we don't like shoddy practices that fall short of catering to the broader consumer market.

It's not always as simple as blaming a small group of trigger-happy consumers for the wider consumer discontent or lack of affordability. The price setter has a pretty big hand in determining how things shape up. Public opinion, whether pro or con, is always expected and encouraged; it carries weight in spreading awareness of "value" and naturally influences how companies approach pricing strategies. It's a natural process since the dawn of time, and it shouldn't be disguised by unilaterally pointing fingers at a group of consumers who are willing to splurge. I've made some pretty nasty splurge-ups myself and still managed to voice opinions in the negative... perhaps hypocrisy, or perhaps they've got us by the balls. Anyway, these products are not "necessities" (well, for most of us), but let us not deny the cry babies... with each teardrop causing ripples, one too many turning to waves... eventually a kick-in-the-backside TSUNAMI! (I admit I'm a little hopeless at the mo, but empires do crumble eventually.)

Lol, can't get over the APU suggestion... we'll just pretend you meant an APU paired with a dedicated card. I'm more in the high-performance category with hi-res panels, you know, the xx80-class(+)... essentially performance/quality goals where APUs are the death of the desired output.
 
To the "encode" crowd .. what do you encode? Do you transcode only or also more complex workflows in Davinci/Premiere?
 