I noticed a significant change in desktop picture quality when I had to switch from dedicated graphics to Intel HD, so I decided to do a short analysis of how different graphics accelerators present images on the desktop, in movies and in games. Keep in mind that I am using the latest software for all of them.
This is not about GeForce vs. Radeon image quality in general, because it does not cover enough data to draw a solid conclusion for 2018 trends; it is more about ''what you can instantly notice when comparing integrated graphics to dedicated''.
1. DESKTOP
Switching to Intel Iris Pro 6200 from the GeForce 900 or Radeon 200 series gave me a dull first impression of my desktop picture - the colors looked washed out - but I corrected the view by adjusting the gamma slider in the Intel settings from 1.0 to 0.7/0.8, which pretty much matched the GeForce and Radeon viewing ''experience''. Keep in mind that adjusting the contrast and brightness in the Intel settings did not help at all; only gamma fixed the issue. So, first conclusion: by default the Intel output is too bright, which washes out the picture.
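For those wondering why only gamma worked: brightness just offsets the whole range and contrast just scales it, while gamma bends the curve in the midtones. Here is a minimal sketch of that effect - I'm assuming the common out = in^(1/gamma) ramp, the exact mapping Intel's driver uses may differ:

```python
# Minimal sketch of a driver gamma ramp, assuming out = in ** (1 / gamma).
# Real Intel/NVIDIA control panels may use a different mapping, so treat
# the exact numbers as illustrative only.
def apply_gamma(value: float, gamma: float) -> float:
    """Map a normalized 0..1 pixel value through a gamma slider setting."""
    return value ** (1.0 / gamma)

for slider in (1.0, 0.8, 0.7):
    print(f"gamma {slider}: 50% grey -> {apply_gamma(0.5, slider):.2f}")
# gamma 1.0: 50% grey -> 0.50
# gamma 0.8: 50% grey -> 0.42
# gamma 0.7: 50% grey -> 0.37
```

Black and white stay where they are; only the midtones get pulled down, which is exactly what the washed-out Intel desktop needed.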
2. MOVIES
GeForce The Great Gatsby
Intel HD The Great Gatsby
I never thought my movies looked dull on my GeForce cards until I compared the picture to Intel HD. Even though blacks look better on Intel HD, there is a bit too much redness in it. So for movies (H.264 codec) you have to do the opposite - lower the gamma in the NVIDIA options to approximately 0.75 to match Intel's default picture, although it never becomes exactly ''the same''.
Radeon The Great Gatsby
Compared to Intel, Radeon provides about the same contrast and depth, but the colors look more accurate, not as reddish. Compared to GeForce, the Radeon picture simply looks better.
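If you don't trust your eyes on the ''redness'', a quick way to check is to compare per-channel averages of two captures of the same frame. A rough sketch with placeholder file names (not the screenshots above):

```python
# Compare average R/G/B levels of two same-frame captures.
# File names are placeholders, not the actual screenshots in this post.
import numpy as np
from PIL import Image

def channel_means(path):
    """Return the average R, G, B values of an image (0-255 floats)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return rgb.reshape(-1, 3).mean(axis=0)

intel = channel_means("gatsby_intel.png")    # placeholder capture
radeon = channel_means("gatsby_radeon.png")  # placeholder capture
print("R/G/B delta (Intel minus Radeon):", intel - radeon)
```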
Intel HD U2 concert Blu-ray monochrome
GeForce U2 concert Blu-ray monochrome
Compared to Intel HD, the GeForce monochrome picture has a slightly bluer tone (you need a very keen eye for that), and it also looks a bit more grainy/noisy.
GeForce Singing in the Rain
Intel Singing in the Rain
Radeon Singing in the Rain
In this Technicolor classic, the dull contrast on GeForce is very evident, while Radeon, having none of that, also manages to surpass Intel HD's color tones.
Intel The Hobbit
Radeon The Hobbit
The difference between Intel and Radeon picture quality when it comes to red over-saturation is not evident in shots like these, but the Radeon picture still feels warmer.
3. GAMES
Let's start with the observation that in many cases there may not be any noticeable difference between the video accelerators. Also take into account that software developers provide and support different image technologies in their products, and that accounts for different image quality in many games when it comes to NVIDIA vs. AMD vs. Intel.
The Vanishing of Ethan Carter. From left to right - Intel HD, Radeon, GeForce.
Never mind the weather effects; the difference across the whole picture is negligible.
Serious Sam 3. From left to right - Intel HD, Radeon, GeForce.
Another example of identical image quality, and there are plenty of such games; for instance, Doom (2016) looks the same on GeForce and Radeon regardless of which API is used, and Crysis looks the same on all video accelerators no matter the DirectX mode. In Dead Space 2, Radeon and Intel provided slightly better dark levels than GeForce. NVIDIA-optimized games with HBAO+ support have a slight edge over AMD-optimized games with HDAO support, but that's not the point here. Let's move on to what really bothered me: Starcraft HD Remastered!
In the actual file the difference is far more evident than when viewed through your browser, so this comparison does not do it full justice...
Starcraft HD Remastered running on GeForce
Starcraft HD Remastered running on Intel HD
This was enough to make my jaw drop in disgust. Not only does the Intel HD picture look like shit, it also lacks a lot of detail that is present on both GeForce and Radeon accelerators. Let's continue with a less striking example.
Prey running on GeForce
Prey running on Radeon
Prey running on Intel
While there is a noticeable difference between GeForce and Radeon, and it is genuinely arguable which accelerator provides the better picture, it's clear that Intel HD graphics once again lacks detail compared to dedicated graphics, though it's not as striking as in Starcraft HD Remastered.
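If you grab the raw captures yourself, a simple per-pixel difference image makes the missing detail easy to spot. Another rough sketch with placeholder file names (both captures need to be the same resolution); this is not how the screenshots above were produced:

```python
# Visualize where two same-frame captures differ; brighter = bigger difference.
# File names are placeholders and both images must match in resolution.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("sc_geforce.png").convert("L"), dtype=np.int16)
b = np.asarray(Image.open("sc_intel.png").convert("L"), dtype=np.int16)

diff = np.abs(a - b).astype(np.uint8)      # per-pixel absolute difference
Image.fromarray(diff).save("sc_diff.png")  # open this to see the hotspots
```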
I guess that is enough to make some valid points: not only do you typically not game on Intel HD graphics, but even a flagship high-end Iris Pro model - the one embedded in my 14 nm Broadwell chip, which roughly equals a GT 1030 in performance - can disgust you with its lack of image quality in some games.
On the other hand, watching movies on a GeForce GPU is not the best idea either... With a Radeon GPU you don't have to adjust or change anything, and that's what I liked about it in the short time I had it.