Intel has introduced the Computer Graphics Visual Quality Metric (CGVQM) tool, the first open-source metric specifically designed to evaluate real-time game imagery with nearly the same accuracy as humans. At its core is the CG-VQD dataset, which contains 80 three-second clips from 15 open-source 3D scenes, ranging from Amazon's Bistro demo to custom environments like House and Bridge, each processed by one of six modern rendering methods, such as neural supersampling, path tracing, or Gaussian splatting. Intel researchers began with a pre-trained 3D ResNet‑18 network and fine-tuned its channel weights to align its outputs with volunteer ratings. The result, CGVQM‑5, outperforms all existing full‑reference metrics and comes very close to matching human agreement.
Behind the scenes, CGVQM splits each video into smaller patches, extracts key visual features with a 3D ResNet backbone, and then adjusts a small set of channel-wise weights so that its predicted scores closely match the quality ratings given by human testers. CGVQM‑5 taps all five ResNet blocks for top accuracy. To make the tool practical for fast build pipelines, the team also created CGVQM‑2, a lighter variant that uses only the first two ResNet blocks. By dropping the deeper features, it runs substantially faster while still beating every rival metric. Both variants produce error maps that clearly highlight artifacts such as ghosting or flicker, letting artists spot and fix issues without running full user studies. Game developers can clone the GitHub repository and add Vulkan hooks or Unreal Engine plugins to integrate CGVQM directly into their workflows and run evaluations on the fly.
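The core idea described above, comparing per-channel feature activations of a reference and a distorted clip and combining the differences with weights learned to match human ratings, can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions: the function name, tensor shapes, and toy data are hypothetical and do not reflect Intel's actual implementation or API, which uses a fine-tuned 3D ResNet‑18 to produce the activations.

```python
import numpy as np

def channel_weighted_score(feats_ref, feats_dist, weights):
    """Hypothetical sketch of a channel-weighted feature distance.

    feats_ref, feats_dist: lists of (C, T, H, W) activation arrays, one per
        ResNet block (C channels, T frames, H x W spatial resolution).
    weights: list of (C,) per-channel weight vectors, nominally learned so
        the combined score tracks human quality ratings.
    Returns a scalar predicted-error score (0 means identical features).
    """
    score = 0.0
    for fr, fd, w in zip(feats_ref, feats_dist, weights):
        # mean squared difference per channel, averaged over time and space
        per_channel = ((fr - fd) ** 2).mean(axis=(1, 2, 3))  # shape (C,)
        score += float(np.dot(w, per_channel))
    return score

# toy example: two "blocks" of random activations plus a mildly distorted copy
rng = np.random.default_rng(0)
feats_ref = [rng.standard_normal((4, 3, 8, 8)),
             rng.standard_normal((8, 3, 4, 4))]
feats_dist = [f + 0.1 * rng.standard_normal(f.shape) for f in feats_ref]
weights = [np.ones(4) / 4, np.ones(8) / 8]

print(channel_weighted_score(feats_ref, feats_ref, weights))   # identical clips
print(channel_weighted_score(feats_ref, feats_dist, weights))  # distorted clip
```

In this toy setup an identical clip scores exactly zero while the distorted one scores above zero; the CGVQM‑2 variant would simply truncate the block lists to the first two entries, which is why it trades a little accuracy for speed.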

View at TechPowerUp Main Site | Source