
NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

Cpt.Jank

NVIDIA's RTX 5000 series GPU hardware has been leaked repeatedly in the weeks and months leading up to CES 2025, with previous leaks tipping significant updates for the RTX 5070 Ti in the VRAM department. Now, Inno3D is apparently hinting that the RTX 5000 series will also introduce updated machine learning and AI tools to NVIDIA's GPU line-up. An official CES 2025 teaser published by Inno3D, titled "Inno3D At CES 2025, See You In Las Vegas!", makes mention of potential updates to NVIDIA's AI acceleration suite for both gaming and productivity.

The Inno3D teaser specifically points out "Advanced DLSS Technology," "Enhanced Ray Tracing" with new RT cores, "better integration of AI in gaming and content creation," "AI-Enhanced Power Efficiency," AI-powered upscaling tech for content creators, and optimizations for generative AI tasks. All of this sounds like it builds on previous NVIDIA technology, like RTX Video Super Resolution, although the mention of content creation suggests it will be more capable than previous efforts, which were seemingly mostly consumer-focused. Improved RT cores in the new RTX 5000 GPUs are also expected, although this would seemingly be the first time NVIDIA uses AI to manage power draw, suggesting that the CES announcement will come with new features for the NVIDIA App. The real standout features, though, are "Neural Rendering" and "Advanced DLSS," both of which are new names. Of course, Advanced DLSS may simply be Inno3D marketing copy, but Neural Rendering suggests that NVIDIA will "revolutionize how graphics are processed and displayed," which is about as vague as one could be.



Just based on the information Inno3D has revealed, we can speculate that there will be a new DLSS technology, perhaps DLSS 4. As for Neural Rendering, NVIDIA has a page detailing its research into new methods of AI-generated textures, shading, and lighting, although it's unclear which of these methods (which seem like they will also need developer-side support in games) it will implement. Whatever it is, NVIDIA will likely divulge the details when it reveals its new RTX 5000 series GPUs.
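NVIDIA hasn't said what Neural Rendering actually entails, but the flavor of the published research is roughly this: replace a stored texture or material with a tiny network that is overfit to it and evaluated at shading time. Below is a minimal, purely illustrative PyTorch sketch of that idea; the network size and training setup are assumptions for demonstration, not anything confirmed for Blackwell.

```python
# Illustrative only: a tiny MLP overfit to a single texture, so that a
# (u, v) lookup becomes a network evaluation. This shows the general
# "neural texture" research idea, not NVIDIA's actual Neural Rendering.
import torch
import torch.nn as nn

class NeuralTexture(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, uv: torch.Tensor) -> torch.Tensor:
        return self.net(uv)

texture = torch.rand(256, 256, 3)  # stand-in for a real 256x256 texture
model = NeuralTexture()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Overfit the network to the texture: sample random UVs, match the texel.
for _ in range(1000):
    uv = torch.rand(4096, 2)
    ij = (uv * 255).long()                     # nearest-texel lookup
    target = texture[ij[:, 0], ij[:, 1]]
    loss = nn.functional.mse_loss(model(uv), target)
    opt.zero_grad(); loss.backward(); opt.step()
```

The appeal in the research is that the network can end up smaller than the texture it replaces; whether anything like this is what NVIDIA means by Neural Rendering is exactly what the CES reveal should clarify.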

View at TechPowerUp Main Site | Source
 
Is NVIDIA trying to give me buyer's remorse for having just bought an RTX 4070 SUPER? :laugh:

Regardless, this article is right that Inno3D's taglines could mean anything. Marketing is marketing for a reason, after all. Improved AI features also aren't surprising, though most of those probably won't benefit the average consumer much and will be targeted towards the enterprise sectors that would've bought an RTX 5090 anyways.
 
wow wow wow nothing new really
 
This is a practice I absolutely cannot stand with Nvidia. I really wish AMD had a true competitor at the 5090 level, but I'm stuck with buying a $2000 GPU next year (I hope it doesn't cost more than that).
 
This is a practice I absolutely cannot stand with Nvidia. I really wish AMD had a true competitor at the 5090 level, but I'm stuck with buying a $2000 GPU next year (I hope it doesn't cost more than that).
Sounds to me like you have more money than sense.
 
Sounds to me like you have more money than sense.
You felt the need to insult me for what reason, huh? 4K gaming @ 240 Hz with maxed settings - what are my options?
 
Awesome. This is relentless innovation. I was a little worried that nVidia would completely forget about gaming as long as 90%+ of their revenue is coming from AI/DC, but it looks like that worry was unfounded. They still keep trucking on the feature set front.

I can't wait to see what they have cooked up for the RTX 5000 series, and I will grab an RTX 5090 as soon as the initial onslaught has died down. I made the (minor) mistake of adopting the RTX 4090 too early, but no one really knew if we'd see a repeat of the COVID and crypto scalping craze when those cards were released. I bought mine for ~€2,400, and three months later the cards were consistently selling for under €2,000.

This time I will wait two or three months until prices and availability have normalized. Then that sweet RTX 5090 ass is mine! :D
 
Absolutely nothing new; the only new things here are the names.

"AI-Enhanced power efficiency", what are you on about nvidia? You would rather sell smoke than actually push for a lower TDP?
 
You felt the need to insult me for what reason, huh? 4K gaming @ 240 Hz with maxed settings - what are my options?
nah, just the simple fact that there is no such thing as 4K gaming @ 240 Hz. I seem to recall the 4090 was said to be just that, and guess what? It isn't. Funny thing is, it needs make-believe frames in order to even operate at playable levels in a ton of games, so it doesn't even really run at 4K settings.

You are chasing a ghost. Well, you and everyone like you. And it's funny, because jacket man can look at people like you as an easy cash grab; you will fall for such marketing.

Or do you mean 4K at 240 Hz playing Terraria or Stardew Valley?
 
Neural Rendering is interesting. I have to wonder if path-traced lighting could be cheated in at a lower resource cost by having an AI generate just the lighting instead of actually simulating the light rays. It would require a very slim and focused generative AI model, as current high-quality models are limited to 1024x1024 and use 24 GB of VRAM, but those are general-purpose models. Still, the model would have to be vastly smaller in order to run as just the lighting step.
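For what it's worth, the academic "deep shading" take on this is much closer to a slim screen-space CNN than to a big generative model: feed it the G-buffer (albedo, normals, depth) and have it predict per-pixel irradiance. A minimal sketch, with a hypothetical architecture and channel layout that I'm making up purely to show the scale involved:

```python
# Hypothetical sketch of "AI guesses the lighting": a slim CNN maps
# G-buffer channels to per-pixel irradiance instead of tracing rays.
# Architecture and channel layout are assumptions, not a real renderer.
import torch
import torch.nn as nn

class LightingNet(nn.Module):
    def __init__(self):
        super().__init__()
        # 7 input channels: RGB albedo + XYZ normal + linear depth
        self.encode = nn.Sequential(
            nn.Conv2d(7, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.decode = nn.Conv2d(32, 3, 3, padding=1)  # RGB irradiance

    def forward(self, gbuffer: torch.Tensor) -> torch.Tensor:
        return self.decode(self.encode(gbuffer))

net = LightingNet()
gbuffer = torch.rand(1, 7, 270, 480)  # quarter-res 1080p G-buffer
irradiance = net(gbuffer)             # -> torch.Size([1, 3, 270, 480])
print(sum(p.numel() for p in net.parameters()))  # ~12k params, i.e. tiny
```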

I also have to wonder how additional AI will impact VRAM usage. The AI is going to have an overhead, but it's also feasible that if the GPU doesn't need to store as much lighting data in VRAM, it could reduce VRAM usage. Then again, the AI might need that data anyway and simply be additive to VRAM consumption.
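Some rough, made-up numbers for that trade-off; both figures below are assumptions purely to show the scales involved:

```python
# Back-of-envelope VRAM math with assumed, illustrative figures.
params = 5_000_000             # a deliberately slim lighting model
weight_bytes = params * 2      # FP16 weights
atlas_bytes = 4096 * 4096 * 8  # one RGBA16F baked-lighting atlas it might replace
print(f"model: {weight_bytes / 2**20:.1f} MiB, atlas: {atlas_bytes / 2**20:.1f} MiB")
# model: 9.5 MiB, atlas: 128.0 MiB
# Net saving if the baked data can be dropped; net cost if the model
# still needs that same data as input on top of its own weights.
```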

Even though I enjoy tinkering with AI a lot, I feel like it should be pointed out that AI has downsides. In the case of generating assets, that typically means a reduction in quality, heavy limits to resolution, added visual artifacts, concept bleed, and noticeable patterns in the output content. In regards to the latter, you can often start to notice patterns in content generated by AI even when the input is different. It gets worse when you start feeding AI-generated content into AI, as those downsides tend to stack.

Awesome. This is relentless innovation. I was a little worried that nVidia would completely forget about gaming as long as 90%+ of their revenue is coming from AI/DC, but it looks like that worry was unfounded. They still keep trucking on the feature set front.

Gaming occupies about 13% of Nvidia's mind, which is about the proportion of sales it represents to them.

Make no mistake, it's pushing AI improvements for it's enterprise customers and not gamers.
 
"AI-Enhanced Power Efficiency" lol, what does that even mean.
 
More meaningless marketing wank.
woohoo :rolleyes:
 
Easy, people like him and possibly you are the reason for high prices for gpus. Cause "you have no choice"

Do you complain about people buying eggs too? How about cars, are people that like expensive cars stupid also?

Judging someone on how they choose to use their disposable income is one of the absolute stupidest things I see on the internet.

Do I get to judge you for living in Detroit’s armpit?
 
Curious to see the new features detailed beyond these uhh.. buzzwords.
 
Easy, people like him and possibly you are the reason for high prices for gpus. Cause "you have no choice"
No no, he's buying the card for $500 while giving Nvidia a $1500 R&D grant to develop better economies of scale. It's for the good of all.

Anyway, it's not a healthy mindset to point one's finger at whoever's closest by, and blame them for the things that aren't right in the world. Let's enjoy each other's presence here! :peace:
 
Make no mistake, it's pushing AI improvements for it's enterprise customers and not gamers.

* its

... and I'm not making that mistake. It still takes "translation" and a lot of work on the software and feature set side to adapt it for the consumer market. That remains impressive, no matter how much people want to downplay it.

I mean, nVidia could be a complacent and lazy company like AMD :p and only release cards that are 10% faster every generation, but they actually choose to keep pushing the envelope, in spite of the fact that less than 10% of their business is gaming these days.

I was fully expecting them to take a break with regard to gaming because, given the extreme boost in revenue, it would have only made sense to shift all of their engineers to work on AI/DC stuff, but here we are with all-new features on the consumer RTX 5000 front. That is commendable, and I'm really looking forward to that RTX 5090 masterpiece. Bring it on, Jen-Hsun! Let's do this shit! :D
 
I currently have an RTX 2060; for 4K, I finally wanted to upgrade to a 5080/5090, depending on price.


But I have quite a few remarks:
  • 24 GB was already more than plenty; those 32 GB are here for AI first and foremost... whilst the 5080 doesn't have enough with 16 GB. That really is absurd product segmentation.
    • I thought Quadro was there for workstations... the gaming segment should still be focused on gaming...
  • Why still 5 nm TSMC nodes? (4NP is 5 nm, not 4 nm.) If those 5090s are really rumoured to be 2,500 EUR, they should have picked a more efficient node.
  • AI like DLSS is great, but I do not want my GPU to "guess" textures and physics (beyond upscaling them). I still want actual developers and creative people to design games and their assets, so I hope that "Neural Rendering" is just a marketing gimmick.
  • AI power efficiency? lol, come on, if they cared so much about efficiency they'd use 2 nm or 3 nm nodes alongside software optimisations.
  • AI thermal management? Unless it means smart undervolting, it's also a pure gimmick imho (a rough sketch of that reading follows below).
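To illustrate only the "smart undervolting" reading: the core of such a feature could be a feedback loop that keeps nudging the voltage offset down while the clock target holds and the card stays stable. Every function and name below is a placeholder I invented for the sketch; there is no public driver API like this, and an "AI" version would presumably just swap the fixed rule for a learned policy.

```python
# Hypothetical feedback-loop undervolting. read_telemetry() and
# apply_voltage_offset() are invented placeholders, not a real driver API.
from dataclasses import dataclass

@dataclass
class Telemetry:
    clock_mhz: float   # sustained core clock
    stable: bool       # did the last interval pass a stability check?

def read_telemetry() -> Telemetry:           # placeholder
    return Telemetry(clock_mhz=2800.0, stable=True)

def apply_voltage_offset(mv: int) -> None:   # placeholder
    pass

def undervolt_step(offset_mv: int, target_mhz: float, step: int = 5) -> int:
    """One control step: lower voltage if stable at target, else back off."""
    t = read_telemetry()
    if t.stable and t.clock_mhz >= target_mhz:
        offset_mv -= step        # try a slightly lower voltage
    else:
        offset_mv += 2 * step    # recover harder than we descended
    apply_voltage_offset(offset_mv)
    return offset_mv
```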
 
  • 24 GB was already more than plenty; those 32 GB are here for AI first and foremost

Depends. If you play something like Microsoft Flight Simulator, it is pretty easy to push past 24 GB of VRAM usage if you install a nice, juicy 8K texture pack for, e.g., the FBW Airbus A380. Been there, done that.

  • Why still 5 nm TSMC nodes? (4NP is 5 nm, not 4 nm.) If those 5090s are really rumoured to be 2,500 EUR, they should have picked a more efficient node.

Same reason AMD still went with 5 nm ("4 nm") for Zen 5: they both need all their 3 nm capacity for AI/DC.
  • AI power efficiency? lol, come on, if they cared so much about efficiency they'd use 2 nm or 3 nm nodes alongside software optimisations.

2 nm is nowhere near ready yet. 3 nm is ready, but it is extremely expensive, and both nVidia and AMD are wise to use it only for AI/DC for now. A 3 nm RTX 5090 would likely carry a price tag of $2,999 (or more).
 
I really wish AMD had a true competitor at the 5090 level
Just so Ngreedia wouldn't charge you as much as they do.

People like you will never buy an AMD gpu.

Well, maybe if they released a GPU that is faster than the 5090 and costs $500, mayyybe some of you would consider it.

Anyways, it sounds to me like they will pull the same shenanigans they pulled with DLSS and the 40 series.
 
Do you complain about people buying eggs too? How about cars, are people that like expensive cars stupid also?

Judging someone on how they choose to use their disposable income is one of the absolute stupidest things I see on the internet.

Do I get to judge you for living in Detroit’s armpit?
Of course.

When the market is screwing you, you try to protest the prices. Most countries do. I find everyone here too lazy and complacent. I actually do complain about people who buy certain cars, for the exact same reason.
 
This is a practice I absolutely cannot stand with Nvidia. I really wish AMD had a true competitor at the 5090 level, but I'm stuck with buying a $2000 GPU next year (I hope it doesn't cost more than that).

No one is forcing you to shell out that kind of money for a GPU. Your purchase is an endorsement of their pricing. If I'm Nvidia looking at this post, I'd wager you seem rather cavalier about $2,000, so I might as well make it $2,300. You are going to buy anyways.
 
I can honestly think of way better things to spend 2K on; I am priced out of that market for sure. I don't care whose name is on the box, it's not worth 2K.

The worst part is, my 2K is not your 2K lol, it's worth even less.

But our government just imploded, so there is that.
 