
Run AI on Your PC? NVIDIA GeForce Users Are Ahead of the Curve

Yeah, and this is how?
ROCm is generally used to get OpenCL working on Linux. I don't think you need it to get OpenCL on Windows; the standard drivers handle that.
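If you want to check what your standard driver already exposes, here's a minimal sketch using pyopencl (assumes `pip install pyopencl`; nothing here depends on ROCm):

```python
# Minimal sketch: list the OpenCL platforms/devices your installed driver
# exposes. On Windows the vendor driver typically ships the OpenCL ICD,
# so this should work without ROCm.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"    Global memory: {device.global_mem_size / 1024**3:.1f} GiB")
        print(f"    Compute units: {device.max_compute_units}")
```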

Other than that: this is as newsworthy as any press release from AMD or elsewhere. It is tagged as such, you know.

Saying a press release is full of PR spin is like saying water is wet.
 
I don't think Adobe got the memo about GPUs being able to accelerate its Generative Fill AI. Adobe still sends the request to the cloud and is thereby able to monetise the process. You get a limited number of usage credits per month, and if you want more, you'll need to shell out big dollars. It could all be done locally, of course, but Adobe won't allow that.
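For what it's worth, the local route already exists in open source. A rough sketch with the diffusers library, just to show the shape of it; this is the community's inpainting pipeline, not Adobe's, and the checkpoint name, file names, and prompt are placeholder assumptions:

```python
# Local "generative fill" via open-source inpainting; NOT Adobe's pipeline,
# just the community equivalent. Assumes: pip install diffusers transformers
# torch, a CUDA GPU with ~6+ GB VRAM, plus image.png and mask.png where
# white mask pixels mark the region to fill.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # assumed checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("image.png").convert("RGB").resize((512, 512))
mask = Image.open("mask.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a wooden bench in a park",  # whatever should fill the hole
    image=image,
    mask_image=mask,
).images[0]
result.save("filled.png")
```

No credits, no subscription; just your own VRAM.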
 
So I can ask my outdated RTX to pretend I have 16 GB of VRAM?

no

fml
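What you can do is check what you've actually got before a model runs out of memory; a quick sketch with PyTorch (no driver trick will make 8 GB look like 16 GB):

```python
# Quick sketch: report the VRAM your GPU actually has. Assumes PyTorch
# with CUDA support is installed; no software makes this number bigger.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device found")
```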
 
Why do people moan about TPU showing press releases? I know this article is pure waffle from Nvidia, but I come to TPU for a roundup of news, including press releases, so I definitely do want to see it on TPU! How else will I know what the latest BS from companies is, so I know who to laugh at in discussions with people?
 
Fake prices
Fake resolution
Fake frames
Fake rays
What's the next "innovation"?
 
Artificial Neural Networks are over 100 years old: https://en.wikipedia.org/wiki/Artificial_neural_network#History
We had chat bots in 1966 (ELIZA). We had ways to upscale, interpolate frames, and fit models to classifiers even 20 years ago. We used to have different words for the different capabilities of our programs.

But every few years some companies come up with a new name and new publicity stunts to make it look like something new, and stupid people take the bait again.

We no longer write software: we "invent AI".
Every piece of hardware that launches is no longer used by software, but by "AI". If you fail to mention AI, you are doomed to not sell even one unit.
AI now means everything and nothing.
AI has lost any practical meaning at this point: instead, it is now a religion that billions of morons insist on believing in.

Artificial Intelligence is not just a trend, it is proof of natural stupidity.
 
Artificial Innovation, that's what I have been seeing.
Within two years the hardware you buy now won't be up to the task, because they're iterating the software that fast, all for a reason; meanwhile, to maintain pricing, they cut production.

So you buy into their tech, and within two years they're telling you that you need to spend $2k to game right.

I bought into it once, and that'll do.
That, right there, is what a callous monopoly looks like, and it's already happening. Sadly.
 
Fake prices
Fake resolution
Fake frames
Fake rays
What's the next "innovation"?
Fake games! Oh wait, those already exist. :laugh:
 
Gaming aside for a sec:
Just as with CUDA, NVIDIA is ahead of AMD (and now also Intel) with RT and AI capabilities, making it the go-to GPU for just about any creator task.

Back to gaming:
In order to see a real change in the current gaming-GPU market, AMD needs to become a real alternative to NV in the creator market.

AMD has a foothold there, but it's just too little...
 
ROCm is generally used to get OpenCL working on Linux. I don't think you need it to get OpenCL on Windows; the standard drivers handle that.

Other than that: this is as newsworthy as any press release from AMD or elsewhere. It is tagged as such, you know.

Saying a press release is full of PR spin is like saying water is wet.
These press releases are fine, as long as we get to shit all over them in the comments honestly. Tit for tat :D
 
Nvidia has way too much money; they've flooded the internet with positive news. I can't read technical news anymore without it being about something they did.
 
Sounds like brainwashing propaganda.

A neophyte reading this pamphlet could easily be led to think it's the next must-have. Pure illusion.

My 100% usage:
- turn on PC
- launch game in native resolution
- Ultra/High settings
- no AI, no RT, no DLSS/FSR
- enjoy gaming several hours
- turn off PC

Just buy a card that performs better at native-resolution rasterization, without all the fancy gimmicks you will never use.
When you need to turn on DLSS/FSR to reach decent FPS, you know it's time to consider upgrading your GPU.
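To put rough numbers on why DLSS/FSR buys frame rate: the upscalers render far fewer pixels internally. A small sketch; the per-axis scale factors are the commonly cited defaults for the DLSS/FSR 2 quality modes and can vary per game, so treat them as assumptions:

```python
# Sketch: internal render resolution behind each upscaler quality mode.
# Per-axis scale factors are commonly cited DLSS/FSR 2 defaults (assumed;
# individual games may differ).
SCALE = {
    "Quality": 0.67,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

def internal_res(width: int, height: int) -> None:
    print(f"Output {width}x{height}:")
    for mode, s in SCALE.items():
        w, h = round(width * s), round(height * s)
        share = 100 * (w * h) / (width * height)
        print(f"  {mode:>17}: {w}x{h} ({share:.0f}% of the pixels)")

internal_res(3840, 2160)  # "4K" output, rendered at roughly 1440p or below
```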
 
When you need to turn on DLSS/FSR to reach decent FPS, you know it's time to consider upgrading your GPU.
This is the problem, though: if it's been decided for us that the playable performance bar is set with DLSS/FSR on, then you need every generation's 4090 equivalent in order to run things natively. I guess it's win-win for Nvidia: you're either buying a 4090 because you need the raw raster performance to play games at the native settings you want, or you need it to play with RTX and DLSS. You gotta admire the strategy, even if you hate it.
 
This is the problem, though: if it's been decided for us that the playable performance bar is set with DLSS/FSR on, then you need every generation's 4090 equivalent in order to run things natively. I guess it's win-win for Nvidia: you're either buying a 4090 because you need the raw raster performance to play games at the native settings you want, or you need it to play with RTX and DLSS. You gotta admire the strategy, even if you hate it.
Or you could use a lower native resolution and settings. Low-to-mid-range cards have never been super capable of the highest resolutions at max settings. It's always been a compromise.
 
Or you could use a lower native resolution and settings. Low-to-mid-range cards have never been super capable of the highest resolutions at max settings. It's always been a compromise.
I guess the point is that the amount you need to spend on a GPU to get 'medium' settings at native resolution is going up, fast.

Edit: maybe Starfield is an exceptional case, but without FSR you need a 3080 to hit 60 fps at 1080p on medium settings! A 3080 for 1080p medium!
 
I guess the point is that the amount you need to spend on a GPU to get 'medium' settings at native resolution is going up, fast.

Edit: maybe Starfield is an exceptional case, but without FSR you need a 3080 to hit 60 fps at 1080p on medium settings! A 3080 for 1080p medium!
I consider it an anomaly, like Crysis.
 
This is the problem, though: if it's been decided for us that the playable performance bar is set with DLSS/FSR on, then you need every generation's 4090 equivalent in order to run things natively. I guess it's win-win for Nvidia: you're either buying a 4090 because you need the raw raster performance to play games at the native settings you want, or you need it to play with RTX and DLSS. You gotta admire the strategy, even if you hate it.

I do admire DLSS/FSR. I think it's great that those technologies are available to us for "free" thanks to the great work of engineers at NVidia and AMD (and Intel too), modders, and game developers; honestly, it is so great to bring this tuning flexibility to users.

I see it more as a way to extend the life of your GPU for one or two more years, which is great; it gives you more time to wait for the next-gen GPUs, or even to skip to the generation after.

I haven't been blown away by the performance-hungry RT effects in any game yet, so I don't need RT, and therefore I don't need DLSS/FSR/RT always on.
Maybe in a few years; we will see.
By that time we will have DLSS 10, games will just run natively from our phones, and it will be the end of the GPU era. Who knows :)
 
Impressive.
If you look closely, that's the most realistic graphics ever produced by an engine. The illumination is almost perfect, the car details likewise; everything is outstanding.
However, it seems the computation power is not there yet, but this can be tweaked.
Looking forward to where this goes in the future.
Amazing tech and times.
 
Impressive.
If you look closely, that's the most realistic graphics ever produced by an engine. The illumination is almost perfect, the car details likewise; everything is outstanding.
However, it seems the computation power is not there yet, but this can be tweaked.
Looking forward to where this goes in the future.
Amazing tech and times.
Except that it's not "graphics produced by an engine", but elements of a video (or several videos) extrapolated onto a graphics engine to make it work in real time.
 
Except that it's not "graphics produced by an engine", but elements of a video (or several videos) extrapolated onto a graphics engine to make it work in real time.
He mentioned using UE4 for this to generate the graphics...
 