
Image quality: Intel Iris Pro vs. Radeon and GeForce

Joined
Feb 22, 2009
Messages
762 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
I noticed a significant change in desktop picture quality when I had to switch from dedicated graphics to Intel HD Graphics. I decided to do a short analysis of how different graphics accelerators render images on the desktop, in movies, and in games. Keep in mind that I am using the latest software for all of the graphics accelerators.


This is not about general GeForce vs. Radeon image quality, because I don't have enough data to draw a solid conclusion about 2018 trends; this is more about "what you can instantly notice when comparing integrated graphics to dedicated graphics".

1. DESKTOP

Switching to Intel Iris Pro 6200 from a GeForce 900 or Radeon 200 series card gave me a dull first impression of my desktop picture - the colors looked washed out. I corrected the view by lowering the gamma slider in the Intel settings from 1.0 to 0.7/0.8, which pretty much matched the GeForce and Radeon "experience". Keep in mind that adjusting the contrast and brightness in the Intel settings did not help at all; only gamma fixed the issue. So the first conclusion: by default, Intel's output is too bright, which washes the colors out.
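To illustrate what that slider is doing, here is a minimal Python sketch of a gamma remap, assuming the common out = in^(1/gamma) convention - driver LUTs vary, so treat this as an approximation of the Intel slider, not its actual code:

```python
# Minimal sketch of a driver-style gamma slider, assuming the common
# out = in ** (1 / gamma) convention; real driver LUTs may differ.
import numpy as np

def apply_gamma(frame: np.ndarray, gamma: float) -> np.ndarray:
    """Remap an 8-bit RGB frame through a gamma lookup table."""
    lut = (np.linspace(0.0, 1.0, 256) ** (1.0 / gamma) * 255.0).round().astype(np.uint8)
    return lut[frame]

mid_gray = np.full((1, 1, 3), 128, dtype=np.uint8)
print(apply_gamma(mid_gray, 1.00).ravel())  # [128 128 128] - unchanged
print(apply_gamma(mid_gray, 0.75).ravel())  # ~[102 102 102] - midtones darken
```

Dropping the slider below 1.0 pulls the midtones down, which is exactly the "less washed out" effect described above.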

2. MOVIES

GeForce The Great Gatsby



Intel HD The Great Gatsby



I never thought my movies looked dull on my GeForce cards until I compared the picture to Intel HD. Although blacks look better on Intel HD, there is a bit too much redness in it. So for movies (H.264 codec) you have to do the opposite - lower the gamma in the NVIDIA options to approximately 0.75 to match Intel's default picture quality, although it never becomes exactly "the same".

Radeon The Great Gatsby



Compared to Intel, Radeon provides about the same contrast and depth, but the colors look more accurate, not so reddish. Compared to GeForce, the Radeon picture just looks better.
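Tint claims like these are easy to sanity-check if you keep raw screenshots of the same paused frame; a hedged sketch (the file names are hypothetical placeholders):

```python
# Quantify a color tint by comparing per-channel means of two screenshots
# of the same paused frame. File names are hypothetical placeholders.
from PIL import Image
import numpy as np

def channel_means(path: str) -> np.ndarray:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return img.reshape(-1, 3).mean(axis=0)  # mean R, G, B

intel = channel_means("gatsby_intel.png")
radeon = channel_means("gatsby_radeon.png")
# A higher red/green ratio on one capture supports the "too much redness" call.
print("R/G ratio, Intel : %.3f" % (intel[0] / intel[1]))
print("R/G ratio, Radeon: %.3f" % (radeon[0] / radeon[1]))
```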

Intel HD U2 concert Blu-ray monochrome



GeForce U2 concert Blu-ray monochrome

Compared to Intel HD, the GeForce monochrome picture has more of a blue tone (you need a very keen eye to see it), and the picture just looks a bit more grainy/noisy.




GeForce Singing in the Rain



Intel Singing in the Rain



Radeon Singing in the Rain



In this Technicolor classic, the dull contrast on behalf of GeForce is super evident, and Radeon, having none of that, also manages to surpass the color tones of Intel HD.

Intel The Hobbit



Radeon The Hobbit



The difference between Intel and Radeon picture quality when it comes to red over-saturation is not evident in shots like these, but the Radeon picture still feels warmer.

3. GAMES

Let's start with the observation that there may not be any noticeable difference between any of the accelerators. Also take into consideration that software developers provide and support different image technologies for their products, which accounts for different image quality in many games when it comes to NVIDIA vs. AMD vs. Intel.

The Vanishing of Ethan Carter. From left to right - Intel HD, Radeon, GeForce.



Never mind the weather effects; the difference across the whole picture is negligible.

Serious Sam 3. From left to right - Intel HD, Radeon, GeForce.



Another example of identical image quality, and there are plenty of such games; for instance, Doom (2016) looks the same on GeForce and Radeon whether or not the API is the same, and Crysis looks the same on all video accelerators regardless of the DirectX mode. In Dead Space 2, Radeon and Intel provided slightly better black levels than GeForce. NVIDIA-optimized games with HBAO+ support have a slight edge over AMD-optimized games with HDAO support, but that's not the point. Let's continue to what really bothered me: Starcraft HD Remastered!

In the real files the difference is far more evident than when viewing through your browser, so this comparison does not do it full justice...
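If you have the raw captures, a quick difference image makes the gap measurable despite the browser compression; a minimal sketch (the file names are hypothetical placeholders):

```python
# Diff two captures of the same Starcraft frame; bright regions in the
# output are where the two renders disagree. File names are hypothetical.
from PIL import Image, ImageChops

geforce = Image.open("sc_geforce.png").convert("RGB")
intel = Image.open("sc_intel.png").convert("RGB")
diff = ImageChops.difference(geforce, intel)
diff.save("sc_diff.png")
print("per-channel (min, max) differences:", diff.getextrema())
```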

Starcraft HD Remastered running on GeForce



Starcraft HD Remastered running on Intel HD



This was enough to make my jaw drop in disgust. Not only does the Intel HD picture look like shit, it also lacks a lot of detail that is present on both the GeForce and Radeon accelerators. Let's continue with a less striking example.

Prey running on GeForce



Prey running on Radeon



Prey running on Intel



While there is a noticeable difference between GeForce and Radeon, and it is seriously arguable which accelerator provides the better picture, it's clear that Intel HD Graphics once again lacks detail compared to dedicated graphics, though it's not as striking as in Starcraft HD.

I guess that is enough to make some valid points. Not only do you typically not game on Intel HD Graphics, but even a flagship high-end Iris Pro model - the one embedded in my 14 nm Broadwell chip, roughly equal to a GT 1030 in performance - can disgust you with its lack of image quality in some games.
On the other hand, watching movies on a GeForce GPU is not the best idea either... With a Radeon GPU you don't have to adjust or change anything, which is what I liked about it in the short time I had it.
 
I've tested picture quality on my 43-inch Sony X800D, comparing the render quality of watching a movie directly from USB versus over an HDMI cable from my Radeon GPU.

I launched a movie from a PC running Windows 10 x64, using the Radeon as my main GPU and rendering the H.264 movie through the madVR renderer with copy-back acceleration. The movie was 1080p, upscaled to 4K. Even though Radeon provided the best picture quality vs. Intel and GeForce, compared to the native Sony X800D quality when playing directly from USB, the Radeon looked shitty. I can't imagine how shitty GeForce would look. I'm glad I bought a TV; watching movies through a PC is a poor experience, no matter how good your monitor might be.
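Part of what a renderer like madVR buys you is the scaling kernel. A minimal sketch of how much the resampling filter alone matters when going 1080p to 4K - the input file is a hypothetical screenshot, and madVR's own kernels are more sophisticated than Lanczos:

```python
# Upscale the same 1080p frame to 4K with a cheap and a high-quality
# filter to see why the scaler matters. Input file is hypothetical.
from PIL import Image

frame = Image.open("frame_1080p.png")                       # 1920x1080 source
target = (3840, 2160)
frame.resize(target, Image.NEAREST).save("up_nearest.png")  # blocky
frame.resize(target, Image.LANCZOS).save("up_lanczos.png")  # much cleaner
```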

The absolute worst-case scenario: watching a movie on a TN panel with a GeForce GPU...
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Did you set the video color range in NVCP to full? The default is limited.
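For context: "limited" range maps black to 16 and white to 235, so if that signal is shown on a full-range (0-255) display without expansion, blacks turn gray and the picture looks washed out. A minimal sketch of the expansion:

```python
# Expand limited-range (16-235) video levels to full range (0-255).
import numpy as np

def limited_to_full(y: np.ndarray) -> np.ndarray:
    out = (y.astype(np.float64) - 16.0) * 255.0 / (235.0 - 16.0)
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

print(limited_to_full(np.array([16, 128, 235])))  # -> [  0 130 255]
```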
 
Did you set the video color range in NVCP to full? The default is limited.

I only tested the Radeon GPU on my TV, but yes, I did set it to 10-bit, and the native view was still better - mainly, the colors were not as washed out as they were from the Radeon.
 
I think one of the key aspects of the better picture quality on my TV is also its output format and great color volume: the DCI color space looks better than RGB when both are viewed on the TV, and both NVIDIA and AMD use RGB for signal output. I don't think the RGB signal sent from the Radeon over the HDMI cable can compete with the TV's native DCI output. I mean, isn't it compressed? I don't think it can even support the full range of colors. Even if it did, the quality would STILL be worse than Sony's DCI.
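For what the "greater color volume" point amounts to numerically, here is a small sketch comparing the sRGB and DCI-P3 gamut triangles on the CIE 1931 xy chromaticity diagram, using the published primary coordinates and the shoelace formula:

```python
# Compare sRGB and DCI-P3 gamut triangle areas on the CIE 1931 xy diagram.
def triangle_area(p):
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
print("DCI-P3 / sRGB area: %.2f" % (triangle_area(dci_p3) / triangle_area(srgb)))
# -> about 1.36: the P3 triangle is roughly a third larger on this diagram
```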

I think I won't even bother with GeForce; there is no way in hell it will look better.
 
Joined
Dec 27, 2013
Messages
887 (0.24/day)
Location
somewhere
Is the Iris Pro in the 5775C/5675C actually as fast as a GT 1030? That would make it as fast as or faster than the 2400G. :eek:

Also that's Intel all over. Cheating :)
 