
Image quality: Intel Iris Pro vs. Radeon and GeForce

I noticed a significant change in desktop picture quality when I had to switch from dedicated graphics to Intel HD. I've decided to do a short analysis of how different graphics accelerators present images on the desktop, in movies and in games. Keep in mind that I am using the latest software for all the graphics accelerators.


This is not about GeForce vs. Radeon image quality in general, because it does not cover enough data to draw a solid conclusion about 2018 trends; it is more about "what you can instantly notice when comparing integrated graphics to dedicated".

1. DESKTOP

Switching to the Intel Iris Pro 6200 from GeForce 900 or Radeon 200 series cards gave me a dull first impression when looking at my desktop picture - the colors looked washed out. I corrected the view by adjusting the gamma slider in the Intel settings from 1.0 to 0.7/0.8, which pretty much matched the GeForce and Radeon "experience". Keep in mind that adjusting contrast and brightness in the Intel settings did not help at all; only gamma fixed the issue. So the first conclusion: by default, Intel's output is too bright and washes the picture out.
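If you want to see what that gamma slider does in isolation, here is a minimal Python sketch (assuming Pillow and numpy are installed; desktop.png is a hypothetical screenshot). It applies the usual out = in^(1/gamma) curve; the driver's exact math may differ, but the direction is the same - values below 1.0 darken the midtones.

```python
# Minimal sketch of a display-style gamma adjustment, assuming Pillow + numpy.
# "desktop.png" is a hypothetical screenshot; the Intel/NVIDIA sliders may use
# slightly different math, but the direction of the effect is the same.
import numpy as np
from PIL import Image

def apply_gamma(img: Image.Image, gamma: float) -> Image.Image:
    """out = in ** (1 / gamma); gamma < 1.0 darkens midtones, > 1.0 brightens them."""
    arr = np.asarray(img).astype(np.float32) / 255.0
    out = np.power(arr, 1.0 / gamma)
    return Image.fromarray((out * 255.0 + 0.5).astype(np.uint8))

shot = Image.open("desktop.png").convert("RGB")
apply_gamma(shot, 0.8).save("desktop_gamma_0.8.png")  # the 0.7-0.8 range from the Intel slider
apply_gamma(shot, 0.7).save("desktop_gamma_0.7.png")
```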

2. MOVIES

GeForce The Great Gatsby

gatsby-geforce.jpg


Intel HD The Great Gatsby

gatsby-intel.jpg


I never thought my movies looked dull on my GeForce cards until I compared the picture to Intel HD. Although blacks look better on Intel HD, there is a bit too much redness in its picture. So for movies (H.264) you have to do the opposite - lower the gamma in the NVIDIA options to approximately 0.75 to match Intel's default picture quality, although it never becomes exactly "the same".

Radeon The Great Gatsby

gatsby-radeon.jpg


Compared to Intel, Radeon provides about the same contrast and depth, but the colors look more accurate, not so reddish. Compared to GeForce, the Radeon picture simply looks better.

Intel HD U2 concert Blu-ray monochrome

u2-intel.jpg


GeForce U2 concert Blu-ray monochrome

Compared to Intel HD, the GeForce monochrome picture has more of a blue tone (you need a very keen eye to spot it), and the picture just looks a bit more grainy/noisy.

u2-geforce.jpg
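A tint like that is easier to confirm with numbers than by eye. A quick sketch (assuming numpy and Pillow, and that both captures show the same frame at the same resolution) just compares the per-channel means of the two screenshots:

```python
# Compare per-channel means of two captures of the same (nominally monochrome) frame.
# Assumes numpy + Pillow; in a neutral frame the R, G and B means should be nearly
# equal, so a higher blue mean relative to red indicates a blue tint.
import numpy as np
from PIL import Image

def channel_means(path: str):
    arr = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return arr[..., 0].mean(), arr[..., 1].mean(), arr[..., 2].mean()

for name in ("u2-intel.jpg", "u2-geforce.jpg"):
    r, g, b = channel_means(name)
    print(f"{name}: R={r:.1f} G={g:.1f} B={b:.1f} (B-R delta: {b - r:+.1f})")
```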



GeForce Singing in the Rain

singing-in-the-rain-geforce.jpg


Intel Singing in the Rain

singing-in-the-rain-intel.jpg


Radeon Singing in the Rain

singing-in-the-rain-radeon.jpg


In this Technicolor classic the dull contrast on the GeForce is very evident, while the Radeon, having none of that, also manages to surpass Intel HD's color tones.

Intel The Hobbit

hobbit-intel.jpg


Radeon The Hobbit

hobbit-radeon.jpg


The difference between the Intel and Radeon picture when it comes to red over-saturation is not evident in shots like these, but the Radeon picture still feels warmer.

3. GAMES

Let's start with the observation that there might not be any noticeable difference between any of the video accelerators. Also worth considering is that software developers provide and support different image technologies in their products, and that accounts for different image quality in many games when it comes to NVIDIA vs. AMD vs. Intel.

The Vanishing of Ethan Carter. From left to right - Intel HD, Radeon, GeForce.

vanishing-of-ethan-carter.jpg


Never mind the weather effects, the difference across the whole picture is negligible.

Serious Sam 3. From left to right - Intel HD, Radeon, GeForce.

serisou-sam-3.jpg


Another example of identical image quality, and there are plenty of such games; for instance, Doom (2016) looks the same on GeForce and Radeon regardless of which API they run, and Crysis looks the same on all video accelerators no matter the DirectX mode. In Dead Space 2, Radeon and Intel provided slightly better black levels than GeForce. NVIDIA-optimized games with HBAO+ support have a slight edge over AMD-optimized games with HDAO support, but that is not the point. Let's continue to what really bothered me: Starcraft HD Remastered!

In the actual files the difference is far more evident than when viewed through your browser, so this comparison does not do it full justice...
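If anyone wants to reproduce the comparison on the raw captures rather than on the compressed images below, a small sketch like this (assuming numpy and Pillow, and that both captures have the same resolution) puts a rough number on the difference and saves an exaggerated difference image:

```python
# Crude per-pixel comparison of two screenshots of the same scene.
# Assumes numpy + Pillow and that both captures have identical resolution;
# the numbers are only a rough indicator, not a perceptual metric.
import numpy as np
from PIL import Image

def diff_report(path_a: str, path_b: str, out_path: str = "diff.png"):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    d = np.abs(a - b)
    print(f"mean absolute difference: {d.mean():.2f} / 255")
    print(f"pixels differing by >10 levels: {100.0 * (d.max(axis=2) > 10).mean():.1f}%")
    # Exaggerate the difference 8x so it is visible, then save it for inspection.
    Image.fromarray(np.clip(d * 8, 0, 255).astype(np.uint8)).save(out_path)

diff_report("starcraft-geforce.jpg", "starcraft-intel.jpg")
```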

Starcraft HD Remastered running on GeForce

starcraft-geforce.jpg


Starcraft HD Remastered running on Intel HD

starcraft-intel.jpg


This was enough to make my jaw drop in disgust. Not only does the Intel HD picture look like shit, it also lacks a lot of detail that is present on both the GeForce and Radeon accelerators. Let's continue with a less striking example.

Prey running on GeForce

prey-geforce.jpg


Prey running on Radeon

prey-radeon.jpg


Prey running on Intel

prey-intel.jpg


While there is a noticeable difference between GeForce and Radeon, and it is seriously arguable which accelerator provides the better picture, it's clear that Intel HD graphics once again lacks detail compared to dedicated graphics, though it's not as striking as in Starcraft HD.

I guess that is enough to make some valid points. Not only would you typically not game on Intel HD graphics anyway, but even the flagship high-end Iris Pro model embedded in my 14 nm Broadwell chip, which roughly equals a GT 1030 in performance, can disgust you with its lack of image quality in some games.
On the other hand, watching movies on a GeForce GPU is not the best idea either... With a Radeon GPU you don't have to adjust or change anything, though - that's what I liked about it in the short time I had it.
 
I've tested picture quality on my 43-inch Sony X800D - comparing the render quality of watching a movie directly from USB versus through an HDMI cable from my Radeon GPU.

I launched a movie from a PC with Windows 10 x64, using the Radeon as my main GPU, rendering the H.264 movie through the madVR renderer with copy-back acceleration. The movie was 1080p, upscaled to 4K. Even though the Radeon provided the best picture quality vs. Intel and GeForce, compared to the Sony X800D's native playback straight from USB, the Radeon looked shitty. I cannot imagine how shitty GeForce would look. Glad I bought a TV; watching movies through a PC is a poor experience, no matter how good your monitor might be.
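For what it's worth, part of what madVR (or the TV's own scaler) gets judged on here is simply the upscaling filter. A tiny sketch (assuming Pillow 9.1 or newer for the Resampling enum, and a hypothetical frame_1080p.png capture) shows how much the filter alone changes a 1080p-to-4K upscale; this is not what madVR does internally, just a baseline comparison.

```python
# Baseline 1080p -> 4K upscales with two different filters, assuming Pillow >= 9.1.
# "frame_1080p.png" is a hypothetical capture; madVR's own scalers are more advanced.
from PIL import Image

frame = Image.open("frame_1080p.png").convert("RGB")
target = (3840, 2160)

frame.resize(target, Image.Resampling.NEAREST).save("upscaled_nearest.png")  # blocky baseline
frame.resize(target, Image.Resampling.LANCZOS).save("upscaled_lanczos.png")  # much smoother
```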

The absolute worst case scenario: watch a movie on a TN panel with GeForce GPU...
 
Did you set the video color range in NVCP to full? The default is limited.
 
I only tested the Radeon GPU on my TV, but yes, I did set it to 10-bit, and the native playback still looked better - mostly, the colors were not as washed out as they were from the Radeon.
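For what it's worth, the full vs. limited range question above comes down to a simple remapping: limited/studio-range video only uses codes 16-235, so if it is displayed unexpanded on a full-range (0-255) screen, blacks turn grey and the picture looks washed out. A minimal sketch of the expansion (assuming numpy and Pillow, with a hypothetical frame.png capture of a limited-range frame):

```python
# Expand limited/studio range (16-235) to full range (0-255), assuming numpy + Pillow.
# "frame.png" is a hypothetical capture of a limited-range video frame.
import numpy as np
from PIL import Image

def expand_limited_to_full(img: Image.Image) -> Image.Image:
    arr = np.asarray(img).astype(np.float32)
    out = (arr - 16.0) * (255.0 / (235.0 - 16.0))  # 16 -> 0, 235 -> 255
    return Image.fromarray(np.clip(out, 0.0, 255.0).astype(np.uint8))

frame = Image.open("frame.png").convert("RGB")
expand_limited_to_full(frame).save("frame_full_range.png")
```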
 
I think one of the key reasons for the better picture quality on my TV is also its output format and great color volume: the DCI color space looks better than RGB when both are viewed on the TV itself, and both NVIDIA and AMD send RGB over the cable. I don't think the RGB signal coming from the Radeon over HDMI can compete with the TV's native DCI output. I mean, isn't it compressed? I don't think it can even carry the full range of colors, and even if it did, the quality would STILL be worse than Sony's DCI.
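To put a rough number on the "greater color volume" part, here is a small sketch comparing the gamut triangles built from the published sRGB and DCI-P3 primaries. The xy chromaticity diagram is not perceptually uniform, so treat the ratio as a ballpark figure only; it also says nothing about a specific TV's accuracy.

```python
# Compare sRGB vs. DCI-P3 gamut triangle areas in CIE xy space using the
# published primaries (shoelace formula). Ballpark illustration only:
# the xy diagram is not perceptually uniform.

def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B primaries

ratio = triangle_area(*dci_p3) / triangle_area(*srgb)
print(f"DCI-P3 covers roughly {ratio:.2f}x the xy area of sRGB")
```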

I think I won't even bother with GeForce; there is no way in hell it will look better.
 
Is the Iris Pro in the 5775C/5675C actually as fast as a GT 1030? That would make it as fast as or faster than the 2400G. :eek:

Also that's Intel all over. Cheating :)
 