
Nvidia's Neural Texture Compression technology should alleviate VRAM concerns

For those of us who feel like we got the shaft on VRAM, this is something nice.

"There is some good news on the horizon thanks to Nvidia. It's working on a new compression technology it calls Neural Texture Compression, and like most of the tech coming out of Nvidia these days, it's thanks to AI."
 
What AI? They mean machine learning?

Per the other thread, firstly, VRAM requirements are just matching up with what the consoles have available.

Secondly, if you actually try these games on cards with lower amounts of VRAM, a few settings tweaks will almost always get you a good gaming experience anyway.
 
Cool. I hope this turns out well.
 
Seems too convenient tbh, the timing is definitely sus, damage control maybe.
But I'm a firm believer this industry is lazy, and it's getting worse; it's become all about throwing more power at it instead of trying to optimise stuff.
 
Damage control.

This isn't going to be in any content for your current GPU regardless, so nice sound bite, but who cares.
At least you can clutch that warm feeling of Nvidia taking care of your future gaming tightly. Oooh, so cozy! Look at them exercising leadership... Too bad it results in heavily overpriced products.
They will sell you this technology so you'll actually pay for dealing with their obnoxiously low VRAM in the future. This is the Nvidia version of "the more you buy, the more we save."
 
I'm sure the people who are working on neural texture compression don't think about damage control or greed. They think it's possible, so they implement it.

What do you expect AI to do? It makes things that are not there. Pixels, frames, decisions, augmenting old with the generated new.
 
This feature wouldn't be necessary in the first place if some games didn't eat up exorbitant amounts of VRAM... I see Nvidia's implementation of this feature decreasing the effort put into texture optimization.
 
It's ironic. Nvidia is shipping their cards with ludicrously low amounts of VRAM with last and current gen, and now they try to convince users that that's enough with something that they're not even going to be able to use. After all, who uses RTX and DLSS, anyways?
It's also the game developers, who just REFUSE to optimize their games, especially with console-to-PC ports.
Console GPUs are AMD though, which may show something interesting...
 
First, Nvidia skimps on VRAM to make sure you always have to buy their latest and most expensive card. Now, you have another proprietary technology to lock you even deeper in the ecosystem. The audacity! :shadedshu:
 
For those of us who feel like we got the shaft on VRAM, this is something nice.

"There is some good news on the horizon thanks to Nvidia. It's working on a new compression technology it calls Neural Texture Compression, and like most of the tech coming out of Nvidia these days, it's thanks to AI."

That article doesn't sound as rosy as you imply.

For one it starts out saying, "NTC can make use of general purpose GPU hardware and the Tensor cores of current gen Nvidia hardware", then ends by saying, "even if future RTX 50 or RTX 60 cards include NTC technology", which doesn't even make sense given they said it works even on current gen hardware.

It goes on to say, "decompression of our textures introduces only a modest timing overhead", and "possibly making our method practical in disk- and memory-constrained graphics applications". The author then speculates that, "it may be that there's a lot of work to be done before NTC becomes practical."

The author's bottom line is, "It's sure to be some time before we see this on GPUs, though. It'll take a long time to go from a white paper to retail."

Meanwhile, what are consumers to do but panic and buy cards with as much VRAM as they can afford, or wait and hope, perhaps quite a long time, while their current hardware becomes less and less viable. If there's one thing we've learned about Nvidia, it's that any of their new high tech advances come at quite a cost, and are usually only available or practical to use on their latest gen hardware. One way or another they find a way to appease the rich, and screw over anyone else.
 
Well, don't buy entry level cards for mid range prices or mid range cards for mafia levels of extortion! But that would also require some research by consumers first, not watching a gazillion YT influencers selling this $hit :slap:
 
Textures don't really use that much memory anymore in proportion to the rest of the assets and other memory the engine might need to allocate. If you don't believe me, fire up any recent game, toggle between ultra and low textures, and you'll see for yourself: it's often a <2GB difference at most. Unless of course they are absurdly high resolution for some reason.

Shouldn't the AI futurist megaminds at Nvidia work on something more useful, like using ML to dynamically generate textures at run time, rather than trying to squeeze another ~5% out of VRAM consumption?
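To put rough numbers on the ultra-vs-low point above, here's the kind of back-of-envelope maths involved (assumptions: BC7-style block compression at roughly 1 byte per texel, and a full mip chain adding about a third on top; real games vary a lot):

# Back-of-envelope texture sizes. Assumptions: BC7-style block compression
# at ~1 byte per texel, full mip chain adds roughly 1/3 on top.
def texture_mib(width, height, bytes_per_texel=1.0, with_mips=True):
    base = width * height * bytes_per_texel / 2**20
    return base * 4 / 3 if with_mips else base

per_4k = texture_mib(4096, 4096)                      # ~21.3 MiB each
print(f"One 4K texture: {per_4k:.1f} MiB")
print(f"100 unique 4K textures: {100 * per_4k / 1024:.2f} GiB")

So under those assumptions a scene needs on the order of a hundred unique 4K textures resident before the texture pool alone approaches 2 GiB.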
 
Textures don't really use that much memory anymore in proportion to the rest of the assets and other memory the engine might need to allocate. If you don't believe me, fire up any recent game, toggle between ultra and low textures, and you'll see for yourself: it's often a <2GB difference at most. Unless of course they are absurdly high resolution for some reason.

Shouldn't the AI futurist megaminds at Nvidia work on something more useful, like using ML to dynamically generate textures at run time, rather than trying to squeeze another ~5% out of VRAM consumption?

That's not how things work at all. They don't have to keep them there for longer if they can stream them faster. And it's not like all the textures are in VRAM all the time; it depends on what you are doing in the game, and some parts may be more demanding. So that is a bit of a pointless exercise. All of this VRAM issue is based on half-assed conclusions and half-baked assumptions, and I'm not just talking about this post.
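For what it's worth, the streaming point is easy to picture: engines keep only the textures (and mip levels) needed right now inside a fixed budget and swap the rest in and out as you move through the game. A toy illustration of that residency idea follows; the names are made up, and real engines are far more sophisticated (per-mip streaming, priorities, async I/O), so treat it as the concept only.

from collections import OrderedDict

class TextureResidency:
    """Toy LRU-style texture pool with a fixed VRAM budget (in MiB)."""
    def __init__(self, budget_mib):
        self.budget = budget_mib
        self.used = 0.0
        self.resident = OrderedDict()   # name -> size in MiB, oldest first

    def request(self, name, size_mib):
        if name in self.resident:       # already in VRAM: just mark it recently used
            self.resident.move_to_end(name)
            return
        while self.resident and self.used + size_mib > self.budget:
            _, freed = self.resident.popitem(last=False)   # evict least recently used
            self.used -= freed
        self.resident[name] = size_mib  # "stream in" the new texture
        self.used += size_mib

pool = TextureResidency(budget_mib=4096)   # pretend 4 GiB texture budget
pool.request("rock_albedo_4k", 21.3)
pool.request("rock_normal_4k", 21.3)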
 
Be happy with what good can come (from this) and not look for the bad.
 
Be happy with what good can come (from this) and not look for the bad.

Jensen at the next GPU reveal... "All our cards only come with 8GB now due to our groundbreaking AI, only available on the 50 series." :laugh: :laugh: :laugh:
 
Or instead they can just include more vram
 
Or instead they can just include more vram

Nah, they'd rather spend millions developing a way to screw over gamers in the midrange even more, let's be real.
 
Or instead they can just include more vram

Definitely sus, but the two things don't cancel out. That's like saying I don't need WinRAR, I can just get a bigger drive or faster internet.
 
"even if future RTX 50 or RTX 60 cards include NTC technology", which doesn't even make sense given they said it works even on current gen hardware.
They will have cores for path tracing and will call it PTX6090; gone are the days of RTX :clap:

Well, Nvidia 5xxx Blackwell won't be here earlier than 2025.
Excellent. And EA should make FIFA once every 4 years. What's up with peeing and releasing? Hold on :shadedshu:

Be happy with what good can come (from this) and not look for the bad.
I always trust birds and manga ;)

Definitely sus, but the two things don't cancel out. That's like saying I don't need WinRAR, I can just get a bigger drive or faster internet.
Allergic to Jensen I am not. He's good. Except for prices, a bit. And a bit more.
 
Definitely sus, but the two things don't cancel out. That's like saying I don't need WinRAR, I can just get a bigger drive or faster internet.
Obviously. Jesus you people like causing arguments.

The technology is cool and I hope they continue making it, but the THREAD context is how it can alleviate vram constraints. And it shouldn’t be the answer to that.
 
how it can alleviate vram constraints. And it shouldn’t be the answer to that.
It can. Suppose they make a 5060 with 24GB of physical VRAM. If it can neurally show me textures four times larger, let it.

I'm sure more VRAM would draw more power, so first find a way to keep things from flying over 500W, then talk about adding VRAM.
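Purely as an illustration of the "four times larger" idea, and only under the big assumption that NTC-style compression really does shrink texture data about 4x compared with today's block-compressed formats while everything else in VRAM stays the same:

# Hypothetical numbers only -- assumes ~4x smaller textures and an
# unchanged non-texture share of VRAM.
vram_gib      = 8.0                              # hypothetical card
textures_gib  = 4.0                              # assumed slice spent on textures today
other_gib     = vram_gib - textures_gib
effective_gib = other_gib + textures_gib * 4     # same bytes hold ~4x the texture data
print(f"Texture-wise, the 8 GiB card behaves more like a {effective_gib:.0f} GiB one")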
 
Obviously. Jesus you people like causing arguments.

The technology is cool and I hope they continue making it, but the THREAD context is how it can alleviate vram constraints. And it shouldn’t be the answer to that.

I just don't see it that way. I see it as suspicious, but it could indeed be the solution if it worked and it's not just damage control. I think it is, but I really can't say.
My point is I don't need more VRAM, I need a problem solved. This could actually be better than more VRAM on cards. More VRAM won't do anything for all the people that already got 8GB cards.
 
The technology is cool and I hope they continue making it, but the THREAD context is how it can alleviate vram constraints. And it shouldn’t be the answer to that.
Yep. Plus NVIDIA is doing good things with AI-optimized drivers and chip design too.
 
I just don't see it that way. I see it as suspicious, but it could indeed be the solution if it worked and it's not just damage control. I think it is, but I really can't say.
My point is I don't need more VRAM, I need a problem solved. This could actually be better than more VRAM on cards. More VRAM won't do anything for all the people that already got 8GB cards.
They already implement the texture compression everyone has had for decades. It's cool that, to use your analogy, they went from zip to rar, but at some point maybe they should just use more VRAM like their competitors.

The answer isn't always more hardware, but I also think there are members in this thread who cling to the 6GB days, and that colors their world view. It is 2023.
 