Monday, January 9th 2023

NVIDIA Could Release AI-Optimized Drivers, Improving Overall Performance

NVIDIA has used artificial intelligence to design and develop parts of its chips in the past, making optimization much more efficient. Today, however, a new rumor claims that NVIDIA will use AI to optimize its driver performance to a degree the human workforce cannot reach. According to CapFrameX, NVIDIA is allegedly preparing special drivers with optimizations done by AI algorithms. As the source claims, the average improvement will be a 10% performance increase, with up to 30% in best-case scenarios. Presumably, AI can optimize on two fronts: shader compilation and per-game optimization, or power management, which includes clocks, voltages, and boost frequency curves.
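If the power-management angle is accurate, the tuning problem can be pictured as searching for an operating point under a hard power budget. Below is a minimal, purely illustrative Python sketch; the simulate() stand-in, the 320 W cap, and every number in it are invented for illustration and have nothing to do with NVIDIA's actual method.

```python
# Toy sketch of AI-style power-curve tuning: random search over
# hypothetical voltage/frequency operating points, maximizing
# performance per watt under a hard power cap. All numbers and
# functions are illustrative stand-ins, not NVIDIA's method.
import random

POWER_LIMIT_W = 320.0  # hypothetical board power cap


def simulate(freq_mhz: float, volt_mv: float) -> tuple[float, float]:
    """Stand-in for a real benchmark run: returns (fps, watts)."""
    fps = 0.05 * freq_mhz * (1.0 - 0.00005 * abs(volt_mv - 1000))
    watts = 1e-7 * freq_mhz * volt_mv ** 2
    return fps, watts


def tune(iterations: int = 10_000) -> tuple[float, float, float]:
    """Return the best (fps_per_watt, freq, volt) found by random search."""
    best = (0.0, 0.0, 0.0)
    for _ in range(iterations):
        freq = random.uniform(1500, 3000)  # MHz
        volt = random.uniform(800, 1100)   # mV
        fps, watts = simulate(freq, volt)
        if watts > POWER_LIMIT_W:
            continue  # hard constraint: never exceed the power cap
        if fps / watts > best[0]:
            best = (fps / watts, freq, volt)
    return best


if __name__ == "__main__":
    score, freq, volt = tune()
    print(f"best point: {freq:.0f} MHz @ {volt:.0f} mV ({score:.3f} fps/W)")
```

A real system would replace simulate() with measurements from actual hardware and a smarter optimizer than random search, but the shape of the loop is the same.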

It is still unclear which of these aspects the company's AI will optimize; however, it could be a combination of the two, given the expected drastic improvement in performance. Special tuning of code for more efficient execution plus a better power/frequency curve would bring the efficiency level one notch above current releases. We have already seen AI solve such problems last year with the PrefixRL model, which compacted circuit designs by 25%. It remains to be seen which cards NVIDIA plans to target; we can only assume that the latest-generation GeForce RTX 40 series will be the goal if the project is made public in Q1 of this year.
Source: CapFrameX

37 Comments on NVIDIA Could Release AI-Optimized Drivers, Improving Overall Performance

#1
bug
If the AI can spot optimizations that we can't, why not harness that, I guess?

This can work in reverse, too: by observing what optimizations the AI comes up with, we can learn to pay attention to things we don't right now.
#3
P4-630
Would be nice if all RTX GPUs from the 2xxx series were supported.
#4
nguyen
Skynet sure working its way toward world domination :rolleyes:
#5
N/A
They have to optimise for redundantly repetitive code. The driver is nearly 2GB already. My fear is they could enable frame generation all the time in all the games by default.
#6
birdie
What an embarrassingly stupid poll.

AI has been used by multiple chip designers for many years now, it's just they find new ways to apply it.

I'm totally OK with AI optimizing drivers as long as those optimizations are transparent, understood and approved by human beings.

The biggest issue with many AI models nowadays is that they are basically black boxes: you don't quite understand what they do or how, which has a non-zero chance of leading to disastrous results when the input data is outside the training set.
#7
neatfeatguy
P4-630: Would be nice if all RTX GPUs from the 2xxx series were supported.
And give people a reason not to upgrade to the Ada series?

That doesn't sound like something Nvidia would want.
#8
dgianstefani
TPU Proofreader
neatfeatguy: And give people a reason not to upgrade to the Ada series?

That doesn't sound like something Nvidia would want.
You realise that the Ada series would also benefit from the performance increases, so the difference between the generations would likely remain the same...
#9
tfdsaf
I doubt AI can do much on the software side, maybe slight improvements like 2-3% from eliminating bloated code, but I think the biggest gains will come from improving the actual GPU core design. I'm fairly confident it can find more optimal solutions; quite a few companies have already used AI to help them develop various products!
#10
neatfeatguy
dgianstefani: You realise that the Ada series would also benefit from the performance increases, so the difference between the generations would likely remain the same...
Nvidia isn't going to give something away for free. They'll come up with something about how it's only available for their 4xxx series because of some stupid reason. I don't have faith in Nvidia.
#11
bug
tfdsaf: I doubt AI can do much on the software side, maybe slight improvements like 2-3% from eliminating bloated code, but I think the biggest gains will come from improving the actual GPU core design. I'm fairly confident it can find more optimal solutions; quite a few companies have already used AI to help them develop various products!
Getting rid of software overhead and implementing multithreading in a safe manner are no small feats. If the AI can take care of only half of that, its use would still be a big win.
#12
Punkenjoy
P4-630: Would be nice if all RTX GPUs from the 2xxx series were supported.
Technically, it should be architecture agnostic. They are running that on their servers, not on your GPU. It's just a matter of knowing if they have the budget and the will to support Turing for a long time. In theory, it could be cheaper to do it that way.
tfdsaf: I doubt AI can do much on the software side, maybe slight improvements like 2-3% from eliminating bloated code, but I think the biggest gains will come from improving the actual GPU core design. I'm fairly confident it can find more optimal solutions; quite a few companies have already used AI to help them develop various products!
Well, unlike most code generated by machine learning, this one is really focused on a single area. I also think they can train it by actually running the games and feeding the results back into the AI. With a very large farm, you could do many iterations in a short time frame. Also, that doesn't prevent humans from contributing hand-optimized code.

In my opinion, this wouldn't lead to major performance upgrades in commonly tested games, which are already pretty optimized. But it could allow NVIDIA to generate optimized drivers for far more games than it currently does, and that is probably where the gains will be much bigger.

In the end, it's mostly about optimizing shaders for a specific architecture that does not change. It's a bit more targeted than generating code for a broad application.
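The iterate-and-measure loop described in this comment is easy to picture. Here is a deliberately simplified Python sketch of per-game tuning by exhaustive search; the flag names and run_benchmark() are invented for illustration, and a real farm would use far smarter search than brute force.

```python
# Minimal sketch of the "benchmark farm" idea: try candidate
# shader-compiler settings per game and keep whichever measures
# fastest. All flag names and run_benchmark() are hypothetical;
# this only illustrates the iterate-and-measure loop.
import itertools
import random

CANDIDATE_FLAGS = {
    "unroll_loops": (True, False),
    "fast_math": (True, False),
    "wave_size": (32, 64),
}


def run_benchmark(game: str, flags: dict) -> float:
    """Stand-in for dispatching a real run to a test rig; returns avg FPS."""
    random.seed(hash((game, tuple(sorted(flags.items())))))
    return random.uniform(60, 120)


def best_flags_for(game: str) -> tuple[dict, float]:
    """Exhaustively benchmark every flag combination for one game."""
    best_cfg, best_fps = {}, 0.0
    keys = list(CANDIDATE_FLAGS)
    for combo in itertools.product(*CANDIDATE_FLAGS.values()):
        cfg = dict(zip(keys, combo))
        fps = run_benchmark(game, cfg)
        if fps > best_fps:
            best_cfg, best_fps = cfg, fps
    return best_cfg, best_fps


for game in ("GameA", "GameB"):
    cfg, fps = best_flags_for(game)
    print(game, cfg, f"{fps:.1f} fps")
```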
#13
ZoneDymo
Imagine if the AI told them the prices were too high
#14
wNotyarD
ZoneDymo: Imagine if the AI told them the prices were too high
NVIDIA would shut it down.
#15
zlobby
Everybody's a software developer until ML starts doing it better and for free. :D

Well, AI may decide that ultimate performance can be obtained by removing all brakes on thermals and power and boosting clocks to the sky. Or that color is overrated and revert back to the CGA color space.

I can see the memes already! Especially cool would be the Chernobyl-based ones. 'And that is how a 4090 explodes.' Or 'Why are we the only company that uses negative OC voltage offset, graphite moderated vapor coolers and AI drivers? Because it's cheaper!'
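Joking aside, this failure mode is exactly why learned tuners are normally wrapped in hard limits that sit outside whatever objective the model optimizes. A trivial Python sketch, with limit values invented purely for illustration:

```python
# Why the "AI removes all the brakes" scenario is guarded against:
# hard limits are enforced outside the learned objective, so no
# matter what the model proposes, settings get clipped. The limit
# values below are made up for illustration.
HARD_LIMITS = {"power_w": (50.0, 450.0), "temp_c": (0.0, 90.0)}


def clamp_proposal(proposal: dict) -> dict:
    """Clip every model-proposed setting into its allowed range."""
    return {
        key: min(max(value, HARD_LIMITS[key][0]), HARD_LIMITS[key][1])
        for key, value in proposal.items()
    }


print(clamp_proposal({"power_w": 9000.0, "temp_c": 120.0}))
# -> {'power_w': 450.0, 'temp_c': 90.0}
```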
#16
D3ssel
neatfeatguy: Nvidia isn't going to give something away for free. They'll come up with something about how it's only available for their 4xxx series because of some stupid reason. I don't have faith in Nvidia.
I would be very surprised if it doesn't reach older generations.

Driver optimizations aren't like technologies such as Frame Generation, etc.

It's really hard to "lock it" to a specific card generation.

But it will surely be custom tailored for them, so the gains on other generations will probably be lower, at least. Possibly much lower.
#17
DeathtoGnomes
bug: If the AI can spot optimizations that we can't, why not harness that, I guess?

This can work in reverse, too: by observing what optimizations the AI comes up with, we can learn to pay attention to things we don't right now.
That would be too simple to do. I mean, who really learns from mistakes? :rolleyes:
#18
bug
zlobby: Everybody's a software developer until ML starts doing it better and for free. :D
Truth be told, writing software is every bit as menial and error-prone as working in a factory was before Henry Ford. I would be really surprised if AI doesn't take over a good chunk of it.
zlobby: Well, AI may decide that ultimate performance can be obtained by removing all brakes on thermals and power and boosting clocks to the sky. Or that color is overrated and revert back to the CGA color space.

I can see the memes already! Especially cool would be the Chernobyl-based ones. 'And that is how a 4090 explodes.' Or 'Why are we the only company that uses negative OC voltage offset, graphite moderated vapor coolers and AI drivers? Because it's cheaper!'
That would be a prime example of "garbage in, garbage out". I deal with this almost every day :(
#19
Slizzo
Linus and Luke have already talked about code generation using AI chat bots on the WAN show a couple times. I agree with them. Letting the bots do certain things as a starting point or to clean up already written code is a very positive thing. It will allow dev teams to be more efficient.
#20
natr0n
It's kinda weird: code optimizing code/itself, basically.
#21
hsew
Slizzo: Linus and Luke have already talked about code generation using AI chat bots on the WAN show a couple times. I agree with them. Letting the bots do certain things as a starting point or to clean up already written code is a very positive thing. It will allow dev teams to be more efficient.
Those still have a long way to go. Try asking ChatGPT for anything written in Pine Script.
#22
bug
Slizzo: Linus and Luke have already talked about code generation using AI chat bots on the WAN show a couple times. I agree with them. Letting the bots do certain things as a starting point or to clean up already written code is a very positive thing. It will allow dev teams to be more efficient.
Eh, a number of years ago (before Heartbleed) there was this issue with Debian (and derivatives) where a tool flagged some memory being read without initialization, and the "fix" removed that code. What ensued was a number of years during which SSL keys generated on Debian came from a tiny, predictable set of roughly 32k variants, because that memory area was used for building entropy and the automated tool had no way of knowing this. And neither did the maintainer that reviewed the change.
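For readers unfamiliar with that incident: once the uninitialized-memory "entropy" was gone, key generation depended essentially on the process ID alone. A toy Python illustration of how small that keyspace becomes (the derivation function is a stand-in, not OpenSSL's actual code path):

```python
# Illustration of why the Debian OpenSSL incident was so severe: if
# the only entropy left is the process ID, the whole keyspace
# collapses to the PID range. The key derivation below is a toy
# stand-in, not the real OpenSSL code path.
import hashlib

MAX_PID = 32768  # classic Linux default pid_max


def toy_key_from_pid(pid: int) -> str:
    return hashlib.sha256(pid.to_bytes(4, "little")).hexdigest()[:16]


all_possible_keys = {toy_key_from_pid(pid) for pid in range(1, MAX_PID)}
print(len(all_possible_keys))  # ~32k keys: few enough to precompute them all
```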

Usage of these tools can be a very positive thing. But they're still tools that need to be handled with care.
#23
delshay
I can see a lot of job losses across the world.
#24
zlobby
delshay: I can see a lot of job losses across the world.
For better or worse, yes.
#25
StefanM
So does this mean that the NGX SDK, which has been labeled "early access" for 4 years (!), will be released?

NGX Architecture

  • NGX software: The RTX feature is integrated into any game, application or plugin through the use of the NGX SDK. This is the main code for the AI-accelerated functionality, but it also requires a trained neural network in order to function. NGX has been designed to be thin: a header file that points to a DLL in the NVIDIA driver, making it simple to augment any application with these RTX features (see the sketch after this list). The NGX SDK provides access to a number of RTX technologies and features for games and digital content creation and editing applications. The capabilities of NGX are tightly coupled to the NVIDIA driver and hardware and make use of Tensor Cores found in RTX-capable GPUs.
  • NGX features: The trained deep neural networks (DNNs) are included in the driver download. These neural networks are updatable as they get more accurate and powerful, allowing the performance and capabilities of these features to evolve over time.
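The "thin header pointing to a DLL in the NVIDIA driver" design can be pictured with a runtime-binding sketch. Everything below, the DLL name and the exported symbol, is invented for illustration; the real NGX SDK defines its own entry points and this is not them.

```python
# Sketch of the "thin shim over a driver DLL" pattern described in
# the NGX notes: the application binds at runtime to an entry point
# shipped inside the driver. The DLL and function names below are
# hypothetical, not actual NGX exports. Windows-only (ctypes.WinDLL).
import ctypes


def load_ngx_like_feature(dll_name: str = "nvngx_example.dll"):
    lib = ctypes.WinDLL(dll_name)   # resolved from the driver install
    init = lib.ExampleFeatureInit   # hypothetical exported symbol
    init.restype = ctypes.c_int
    status = init()
    if status != 0:
        raise RuntimeError(f"feature init failed with code {status}")
    return lib
```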