
Intel Meteor Lake Can Play Videos Without a GPU, Thanks to the new Standalone Media Unit

AleksandarK

News Editor
Intel's upcoming Meteor Lake (MTL) processor is set to deliver a wide range of exciting solutions, starting with the Intel 4 manufacturing node. However, today we have some interesting Linux kernel patches indicating that Meteor Lake will have a dedicated "Standalone Media" Graphics Technology (GT) block to process video/audio. Moving encoding and decoding off the GPU to a dedicated media engine will allow MTL to play back video without the GPU, freeing the GPU to be used as a parallel processing powerhouse. Features like Intel QuickSync will be built into this unit. What is interesting is that this unit will sit on a separate tile, combined with the rest of the chip using the tile-based manufacturing found in Ponte Vecchio (which has 47 tiles).

Intel Linux Patches said:
Starting with [Meteor Lake], media functionality has moved into a new, second GT at the hardware level. This new GT, referred to as "standalone media" in the spec, has its own GuC, power management/forcewake, etc. The general non-engine GT registers for standalone media start at 0x380000, but otherwise use the same MMIO offsets as the primary GT.

Standalone media has a lot of similarity to the remote tiles present on platforms like [Xe HP Software Development Vehicle] and [Ponte Vecchio], and our i915 [kernel graphics driver] implementation can share much of the general "multi GT" infrastructure between the two types of platforms.


 
I hope this means we can get an F-series CPU that drops the iGPU but retains the video encode/decode block. Or that we can disable the iGPU while retaining video encode/decode on K and non-K SKUs. Intel's video engine is miles ahead of the competition, with the exception of Apple's M series, that is.
 
Can't wait for the Intel 4 manufacturing node :)
 
I wonder if this is what AMD is trying to do with their new GPUs. Take everything but the CUs (if I understand it right) and move it to another chiplet/tile. Later they can just increase the size of the chiplet/tile that houses the CUs, and/or keep making everything else, like the media engine, on an older process, like what they are doing with the I/O die in their CPUs.
 
Besides easier manufacturing, is there an actual advantage to this? I mean, GPUs can power gate unused areas anyway, so they're basically turned off while decoding video (unless using shaders for post-processing).
 
Hi,
Seems like a clash between audio and graphics drivers, and quality issues, could be coming soon. Adding complexity where none is needed.

Personally, if I use a graphics card I'd prefer it be used primarily; only if I didn't have one would onboard graphics be preferred, for obvious reasons.

Maybe it's just to make browser hardware acceleration easier to maintain?
 
I hope this means we can get an F-series CPU that drops the iGPU but retains the video encode/decode block. Or that we can disable the iGPU while retaining video encode/decode on K and non-K SKUs. Intel's video engine is miles ahead of the competition, with the exception of Apple's M series, that is.
Yes, the logical assumption would be that GPU means iGPU here.
 
Besides easier manufacturing, is there an actual advantage to this? I mean, GPUs can power gate unused areas anyway, so they're basically turned off while decoding video (unless using shaders for post-processing).
Not so much easier manufacturing as lowering defect rates to increase product value. Video encode/decode is relatively uncomplicated fixed-function hardware that has a low failure rate during manufacture, versus something intricate like a full-blown GPU that is far more likely to end up defective. Makes complete sense to split the two so that more CPUs ship with functioning video decode/encode (which let's be honest, is all that 99.9% of Intel's iGPUs are used for).
 
Not so much easier manufacturing as lowering defect rates to increase product value. Video encode/decode is relatively uncomplicated fixed-function hardware that has a low failure rate during manufacture, versus something intricate like a full-blown GPU that is far more likely to end up defective. Makes complete sense to split the two so that more CPUs ship with functioning video decode/encode (which let's be honest, is all that 99.9% of Intel's iGPUs are used for).
Yeah, poor choice of words. Manufacturing is actually more complicated, but when you're putting out smaller dies, you get to churn out more good dies per wafer; that's what I meant. Not to mention you can build this on a not-so-cutting-edge node, since it's not that performance critical.
 
I don't understand.

Intel IGPs have fixed-function decode/encode in a media block already, and have done for at least a decade. That fixed-function block was previously under the umbrella of "IGP" but that was merely a naming convention, it was part of the CPU silicon and not actually responsible for display output or GPU functions.

As far as I can tell (please correct me if I'm missing something fundamental) this announcement is just a shift in the way the existing encode/decode FF hardware is named. They call it a tile now, rather than a logic block, and that makes their block diagrams tidier to look at, but the end result is that it's still baked into the CPU die as before.

The only benefit or change that I can imagine is that QuickSync could become available on the SKUs ending in F.
 
IMO Intel graphics are always the best when used for low-power applications and QuickSync: stable and functional.

I do wanna chime in here: on a 12600K, so currently still the latest Intel has to offer, it's not actually THAT stable. Watching YouTube vids on it, the screen sometimes goes black, and when I have two videos open and one playing, it sometimes glitches out and copies that vid to the other window, and I have to reload it to get it back to normal.

Not a huge issue, but not as stable as I want it.

On topic:

So now we have an even, erm, smaller iGPU?
I wonder if this means that the line of Intel CPUs without iGPUs (the F designation, I think) can then still do video etc. with Meteor Lake.
 
I don't think Intel's gonna make a separate tile just for A/V; from the looks of it, this is just a slight name change.
 
Hi,
Seems like a clash between audio and graphics drivers, and quality issues, could be coming soon. Adding complexity where none is needed.

Personally, if I use a graphics card I'd prefer it be used primarily; only if I didn't have one would onboard graphics be preferred, for obvious reasons.

Maybe it's just to make browser hardware acceleration easier to maintain?
For decoding, Intel's media engine is actually the best. It can decode stuff that even Nvidia can't, and much, much faster too. It's why Intel-based systems tend to always beat AMD for video editing.
 
Hi,
PhysX I often switch to CPU instead of auto in the Nvidia control panel, so I guess all this is just no big deal of a change.
 
I don't understand.

Intel IGPs have fixed-function decode/encode in a media block already, and have done for at least a decade. That fixed-function block was previously under the umbrella of "IGP" but that was merely a naming convention, it was part of the CPU silicon and not actually responsible for display output or GPU functions.

As far as I can tell (please correct me if I'm missing something fundamental) this announcement is just a shift in the way the existing encode/decode FF hardware is named. They call it a tile now, rather than a logic block, and that makes their block diagrams tidier to look at, but the end result is that it's still baked into the CPU die as before.

The only benefit or change that I can imagine is that QuickSync could become available on the SKUs ending in F.
As far as I understand, it's still a block, but not within the CPU. It's a chiplet (or whatever Intel calls it) on its own now.
 
As far as I understand, it's still a block, but not within the CPU. It's a chiplet (or whatever Intel calls it) on its own now.
Tile, but "tile" is just Intel's internal name for that logic block. It's still monolithic silicon, not an MCP like Ryzen.

Edit:
Wait, that's true for Alder Lake; Meteor Lake may actually use true tiles (which are effectively chiplets, like AMD's MCPs, under a slightly different interposer and trademarked name).
 
Sounds like what the Neural Engine does in Apple Silicon. The NE, when leveraged properly, is quite impressive. For example, DXO PureRAW v2 uses the NE, where v1 used the iGPU. It knocks the processing time down on my 20MP RAW files from 20s to about 8s. If QS is using this hardware too, I suspect other programs like DXO PureRAW will as well, provided it’s easy for developers to implement. I suspect Intel would assist in that.
 
Besides easier manufacturing, is there an actual advantage to this? I mean, GPUs can power gate unused areas anyway, so they're basically turned off while decoding video (unless using shaders for post-processing).
What if, and it's a big what if, Intel is really abandoning GPUs, wants to keep its nice video encoding/decoding, but wants to use a third-party GPU instead...

That would be a great first step down that path. But at the same time, it's probably about bringing those engines closer to the CPU, as that may make more sense for Intel if they choose not to compete on high-end GPUs.
 
What if, and it's a big what if, Intel is really abandoning GPUs, wants to keep its nice video encoding/decoding, but wants to use a third-party GPU instead...

That would be a great first step down that path. But at the same time, it's probably about bringing those engines closer to the CPU, as that may make more sense for Intel if they choose not to compete on high-end GPUs.
Intriguing as that may be, it's not possible (or a smart move). IGPs are a must for laptops/ultrabooks. What would Intel use instead?
Also, abandoning GPUs right after they launch Arc? It's probably the worst moment in the history of Intel to do that.
 
Intriguing as that may be, it's not possible (or a smart move). IGPs are a must for laptops/ultrabooks. What would Intel use instead?
Also, abandoning GPUs right after they launch Arc? It's probably the worst moment in the history of Intel to do that.
There are rumours that Intel wants to ditch its GPU division. I think it would be a mistake for them, but they could outsource it to a company like PowerVR, Qualcomm or Nvidia. Maybe not on all SKUs; they could keep a minimal iGPU for some workloads, but they could add a tile from a third party in their package, since they are going chiplets with Meteor Lake and beyond.
 
iGPUs are essential, even in some very basic form (which is what Intel did for years with the 14nm CPUs). People who don’t game can get by with an iGPU, and almost every base work PC fits that bill. I’m surprised AMD didn’t include a GPU in all Ryzens for all these years, and it’s good to see them bringing it back with Zen4. Just being able to drive a display without a dGPU is great for troubleshooting.
 