
AMD Designs Neural Block Compression Tech for Games: Smaller Downloads and Updates

btarunr

Editor & Senior Moderator
AMD is developing a new technology that promises to significantly reduce the on-disk size of games, as well as the size of game patches and updates. Today's AAA games tend to be over 100 GB in size, with game updates running into the tens of gigabytes; some major updates practically download the game all over again. Upcoming games like Call of Duty: Black Ops 6 are reportedly over 300 GB in size, which puts the game out of reach for anyone without an Internet connection running at hundreds of Mbps. Much of a game's bulk is made up of visual assets: textures, sprites, and cutscene videos. A modern AAA title could have hundreds of thousands of individual game assets, and sometimes even redundant sets of textures for different image quality settings.

AMD's solution to this problem is its Neural Block Compression technology. The company will get into the nuts and bolts of the tech in its presentation at the 2024 Eurographics Symposium on Rendering (July 3-5), but we have a vague idea of what it could be. Modern games don't drape the surfaces of a wireframe with just a base texture, but with additional layers as well, such as specular maps, normal maps, and roughness maps. AMD's idea is to "flatten" all these layers, including the base texture, into a single asset format, which the game engine could then disaggregate back into the individual layers using an AI neural network. This is not to be confused with mega-textures, which are something entirely different: a single large texture covering all objects in a scene. The idea here is to flatten the various data layers of individual textures and their maps into a single asset type. In theory, this should yield significant file-size savings, even if it comes at some additional compute cost on the client's end.
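AMD hasn't published the actual format or network yet, so purely as an illustration of the "flatten the layers, then neurally disaggregate" idea, here is a minimal sketch. The layer set, 4x4 block size, latent size, and tiny MLP decoder below are all our own assumptions, not AMD's:

```python
# Hypothetical illustration only: AMD has not published Neural Block Compression
# details, so the layer names, 4x4 block size, latent size, and tiny MLP decoder
# here are assumptions used purely to show the general idea.
import numpy as np

BLOCK = 4        # assumed texel block size (like BCn formats)
LATENT_DIM = 8   # assumed per-block latent size

def flatten_layers(albedo, normal, roughness):
    """Stack the per-texel layers into one H x W x 7 array (the 'flattened' asset)."""
    return np.concatenate([albedo, normal, roughness[..., None]], axis=-1)

def encode_blocks(flat, codebook):
    """Toy encoder: project each 4x4 block of all layers onto a shared linear
    codebook, keeping only LATENT_DIM coefficients per block."""
    h, w, c = flat.shape
    blocks = flat.reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK, c)
    blocks = blocks.transpose(0, 2, 1, 3, 4).reshape(-1, BLOCK * BLOCK * c)
    return blocks @ codebook                      # (num_blocks, LATENT_DIM)

def neural_decode(latents, w1, b1, w2, b2):
    """Tiny MLP standing in for the neural network that disaggregates the packed
    asset back into the individual texture layers at runtime."""
    hidden = np.maximum(latents @ w1 + b1, 0.0)   # ReLU
    return hidden @ w2 + b2                       # (num_blocks, BLOCK*BLOCK*7)

# Dummy layers and randomly initialised weights, just to run end to end.
rng = np.random.default_rng(0)
albedo    = rng.random((64, 64, 3), dtype=np.float32)
normal    = rng.random((64, 64, 3), dtype=np.float32)
roughness = rng.random((64, 64), dtype=np.float32)

flat     = flatten_layers(albedo, normal, roughness)
codebook = rng.standard_normal((BLOCK * BLOCK * 7, LATENT_DIM)).astype(np.float32)
latents  = encode_blocks(flat, codebook)          # this is what would ship on disk

w1 = rng.standard_normal((LATENT_DIM, 32)).astype(np.float32)
b1 = np.zeros(32, dtype=np.float32)
w2 = rng.standard_normal((32, BLOCK * BLOCK * 7)).astype(np.float32)
b2 = np.zeros(BLOCK * BLOCK * 7, dtype=np.float32)

decoded = neural_decode(latents, w1, b1, w2, b2)
print(latents.size, "stored values vs", flat.size, "raw values")
```

In a scheme of this shape, the savings come from storing only the small per-block latents (plus shared decoder weights) instead of every channel of every map, at the cost of running the decoder whenever the asset is needed.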



View at TechPowerUp Main Site | Source
 
I suspect reducing the asset download size is not the actual end goal, but a bonus. I don't think there's much stopping people from shipping generative models that generate the assets themselves from specified seeds and post-processing, or even dynamic textures, at this point.

Years ago there was an FPS demo well under 100 KB, presumably with mathematically described meshes and textures. This might ultimately be a fancier version of that with generative AI stapled on.

Interesting. In this case, how much performance and time will be spent compressing/decompressing the assets? One of the reasons game levels don't load in literally 2-3 seconds is decompression operations.
It probably need not be synchronous. Unpack at install time for commonly-used assets and just-in-time generation for others, maybe.
 
Interesting. In this case, how much performance and time will be spent compressing/decompressing the assets? One of the reasons game levels don't load in literally 2-3 seconds is decompression operations.
Hopefully, it'll use the otherwise useless AI cores, which means the CPU and other parts of the GPU will be relatively load-free and ready to do other tasks. Hopefully.
 
Years ago there was an FPS demo well under 100 KB, presumably with mathematically described meshes and textures. This might ultimately be a fancier version of that with generative AI stapled on.
96 KB to be precise; the game is called .kkrieger. I still have it somewhere on my hard drive. It's impressive, as it uses DirectX 9 IIRC and looks great with pixel shader effects and lighting for a sub-100 KB game.
 
Massive downloads are another sign of a shit developer that isn't being intelligent with asset management, IMO.

Things that irk me in particular are cinematics that are clearly just in-engine footage encoded to 1080p video. If you're playing at 1440p or 4K ultra, or at a higher framerate than the video, the cutscenes are lower resolution AND lower framerate, so they're jarringly worse than your regular gameplay, all while taking up an absolute gobload of disk space and pulling you out of the immersive in-engine experience you were enjoying before the cutscene kicked in.

Patches that redownload the whole thing rather than just a delta patch are also stupid. If a 10 GB file has changed, don't download the whole 10 GB; download the 400 KB that's actually been updated from the original. Bit-level replication is ancient technology at this point, so why aren't game devs using it?
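To illustrate the principle (a toy sketch only, not how any particular launcher or store actually implements it): hash the old build in fixed-size blocks, compare against the new build, and ship only the blocks whose hashes changed.

```python
# Toy block-level delta patch: ship only the blocks that changed between builds.
# Purely illustrative; real tools (rsync and friends) use rolling hashes so that
# inserted bytes don't shift and invalidate every subsequent block.
import hashlib

BLOCK_SIZE = 64 * 1024  # assumed 64 KB granularity

def block_hashes(data: bytes):
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
            for i in range(0, len(data), BLOCK_SIZE)]

def make_delta(old: bytes, new: bytes):
    """Return (block_index, new_bytes) pairs for blocks that differ."""
    old_hashes = block_hashes(old)
    delta = []
    for i in range(0, len(new), BLOCK_SIZE):
        idx, chunk = i // BLOCK_SIZE, new[i:i + BLOCK_SIZE]
        if idx >= len(old_hashes) or hashlib.sha256(chunk).digest() != old_hashes[idx]:
            delta.append((idx, chunk))
    return delta

def apply_delta(old: bytes, delta, new_len: int):
    """Rebuild the new file from the old file plus only the changed blocks."""
    n_blocks = (new_len + BLOCK_SIZE - 1) // BLOCK_SIZE
    blocks = [old[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE] for i in range(n_blocks)]
    for idx, chunk in delta:
        blocks[idx] = chunk
    return b"".join(blocks)[:new_len]

old = b"A" * (10 * BLOCK_SIZE)                                   # 640 KB "file"
new = old[:3 * BLOCK_SIZE] + b"B" * BLOCK_SIZE + old[4 * BLOCK_SIZE:]
delta = make_delta(old, new)
assert apply_delta(old, delta, len(new)) == new
print(f"blocks shipped: {len(delta)} of {len(new) // BLOCK_SIZE}")
```

In this toy example only 1 of 10 blocks ships, which is exactly the 400 KB-out-of-10 GB situation described above.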
 
Massive downloads are another sign of a shit developer that isn't being intelligent with asset management, IMO.

Things that irk me in particular are cinematics that are clearly just in-engine footage encoded to 1080p video. If you're playing at 1440p or 4K ultra, or at a higher framerate than the video, the cutscenes are lower resolution AND lower framerate, so they're jarringly worse than your regular gameplay, all while taking up an absolute gobload of disk space and pulling you out of the immersive in-engine experience you were enjoying before the cutscene kicked in.

Patches that redownload the whole thing rather than just a delta patch are also stupid. If a 10 GB file has changed, don't download the whole 10 GB; download the 400 KB that's actually been updated from the original. Bit-level replication is ancient technology at this point, so why aren't game devs using it?

Cutscenes can be good and bad, honestly. After you've seen them once, having to skip them becomes a hassle. I wish game engines had a checkbox in the options to turn cutscenes off automatically after you've watched them, rather than making you press space every time you come across the same cutscene for the 100th time.
 
Cutscenes can be good and bad, honestly. After you've seen them once, having to skip them becomes a hassle. I wish game engines had a checkbox in the options to turn cutscenes off automatically after you've watched them, rather than making you press space every time you come across the same cutscene for the 100th time.
I like cutscenes, but not when the whole game is a massive cutscene with the occasional "press X button" message imprinted on it. If I want to watch a movie, I'll do that.

I miss the live-action cutscenes of old. I'm glad Alan Wake 2 brought them back. :)
 
Massive downloads are another sign of a shit developer that isn't being intelligent with asset management, IMO.

Things that irk me in particular are cinematics that are clearly just in-engine footage encoded to 1080p video. If you're playing at 1440p or 4K ultra, or at a higher framerate than the video, the cutscenes are lower resolution AND lower framerate, so they're jarringly worse than your regular gameplay, all while taking up an absolute gobload of disk space and pulling you out of the immersive in-engine experience you were enjoying before the cutscene kicked in.

Patches that redownload the whole thing rather than just a delta patch are also stupid. If a 10 GB file has changed, don't download the whole 10 GB; download the 400 KB that's actually been updated from the original. Bit-level replication is ancient technology at this point, so why aren't game devs using it?

Whether we like it or not, the era of polished AAA games is behind us. They cost too much to make because all the small details are a time sink.

Now we have unoptimized games enhanced with AI so that they run correctly with lower dev costs.
 
Whether we like it or not, the era of polished AAA games is behind us. They cost too much to make because all the small details are a time sink.

Now we have unoptimized games enhanced with AI so that they run correctly with lower dev costs.
If our own hardware is going to generate games from scratch, literally from scratch, then never again a penny for the game studios. They become redundant. They'll have to look for another job, or just be fed by social services.
 
If our own hardware is going to generate games from scratch, literally from scratch, then never again a penny for the game studios. They become redundant. They'll have to look for another job, or just be fed by social services.
Agreed. And at least I won't have to upgrade my PC in the next decade, because all the old games I'm interested in run on it just fine.
 
Smells like yet another problem invented so that "AI" can solve it.
 
How in the hell is Call of Duty: Black Ops 6 over 300 GB in size? :wtf:

Tons of AAA games don't even come close to that, and even the biggest ones tend to be around 150 GB or even half that. COD is getting way out of control; if that game gets much bigger in the future, you'll need a dedicated SSD just for it.
 
How in the hell is Call of Duty: Black Ops 6 over 300 GB in size? :wtf:

Tons of AAA games don't even come close to that, and even the biggest ones tend to be around 150 GB or even half that. COD is getting way out of control; if that game gets much bigger in the future, you'll need a dedicated SSD just for it.
It's not. The 300 GB install size included multiple previous entries. Apparently btarunr didn't get the memo that the initial Twitter outrage was over a misunderstanding (as usual).
 
How the narrative has changed. :rolleyes:

 
How the narrative has changed. :rolleyes:


Don't worry, this time the tech is coming from AMD so it's valid, cool, innovative, exciting, acceptable, positive, etc.
 
How the narrative has changed. :rolleyes:


Don't worry, this time the tech is coming from AMD so it's valid, cool, innovative, exciting, acceptable, positive, etc.

I'm not seeing any AMD flag waving here. I think you're trolling a little too hard.
 
I'm not seeing any AMD flag waving here. I think you're trolling a little too hard.

Perhaps, but I took a look in that thread and it really seemed negative when they announced something similar. I'm glad that it's no longer seen as "damage control", though. It's genuinely cool tech.
 
Perhaps, but I took a look in that thread and it really seemed negative when they announced something similar. I'm glad that it's no longer seen as "damage control", though. It's genuinely cool tech.

Probably because it was seen as an excuse for Nvidia's approach at bifurcating the product stack along VRAM lines. In that scenario, it was seen as another software solution for a hardware deficiency (right or wrong). This isn't about a hardware limit on the GPU side, and it has no bearing on AMD's product.

Technically very different things.
 
Probably because it was seen as an excuse for Nvidia's approach at bifurcating the product stack along VRAM lines. In that scenario, it was seen as another software solution for a hardware deficiency (right or wrong). This isn't about a hardware limit on the GPU side, and it has no bearing on AMD's product.

Technically very different things.

Fair enough. I suppose it's also a matter of being almost a year ahead. Parallels between them are definitely there: both technologies seek to optimize storage and seem to target a similar thing. I just found this more detailed explanation and the research paper on it:


Maths behind it are insane, man :eek:
 
Parallels between them are definitely there: both technologies seek to optimize storage

Not really, they're going after very different applications. You could say Nvidia is looking to optimize frame buffer storage, but that's a huge stretch; the entire news article is about saving VRAM space, and they're very vague about the performance hit on rendering ("it exists, but don't worry about it, just trust me bro"). AMD, on the other hand, is looking at the base files on disk, something that's none of their business to be honest - it should be on game developers and game engines - but if no one else is going to bother, even with games going nuts requiring 100, 200, or even 300 GB, good on them for doing something about it.

Something more concerning is how reconstruction techniques are being labelled as compression - by both Nvidia and AMD. They can be great and all, but they're not compression; the neural network is generating new information. It can be very close to the original uncompressed data, but it won't be the same thing. A better term would be neural-network-based reconstruction.
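A minimal way to see the distinction (a toy example; the quantization step below is just a stand-in for a neural decoder's approximation, not anyone's actual technique):

```python
# Lossless compression round-trips bit-exactly; "reconstruction" only approximates.
# The quantization below is a toy stand-in for a neural decoder, not anyone's
# actual technique.
import zlib
import numpy as np

original = np.random.default_rng(1).random(4096, dtype=np.float32)

# Classic lossless compression: decompress(compress(x)) is identical to x.
restored = np.frombuffer(zlib.decompress(zlib.compress(original.tobytes())),
                         dtype=np.float32)
assert np.array_equal(original, restored)

# "Reconstruction": the output is newly generated data that merely resembles x.
reconstructed = np.round(original * 32) / 32
print("max reconstruction error:", float(np.abs(original - reconstructed).max()))
```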
 
Smells like yet another problem invented so that "AI" can solve it.

More like "solving" the existing problem the wrong way and compounding it instead.

DL sizes shouldn't be this big to begin with. They're packing 100 GB+ of crap into half-baked releases; now they'll have an excuse to pack 120-150+ GB of crap into half-baked releases with no improvement in actual quality or more meaningful content, just rushed, bloated crap, because "muh kompreshun!1!!1!!!!!!"
 