
Bad image quality for NVIDIA GPUs at default settings

I must be missing something as well with the images. Looking at the 660 Ti SLI vs 7950 CrossFire, it occurs to me that the 7950 image looks much closer to my Nvidia card images in Sleeping Dogs than the 660 SLI screenie does. I think it's merely a matter of better cards from both camps producing better pictures. It's a really bad example to use to try and make a point.

@GreiverBlade I too prefer to set my adjustments in-game, rather than forcing anything from the driver.
 
You can still do both and only the applicable ones will be adjusted.
nah ... driver override is not acceptable for me, i never touch those settings; in-game is the way to go ;)
 

I agree... part of me is practically OCD about that. Unless the game doesn't support AA or something silly, I almost never use driver overrides.
 
nah ... driver override is not acceptable for me, i never touch those settings; in-game is the way to go ;)
Not if it's a game like Skyrim that uses a really crappy way of doing AF.
 
AF? autofocus? o_O oh silly me ... AF ... Anisotropic Filtering ... :oops::p right?

my Skyrim settings are always no AA and no anisotropic filtering, plus ENB and a custom ini
 
OP is right, but worded in such a way that people are reading into it too much. AMD and nVidia (and Intel) each have their own way of rendering a specific effect, and thus the rendered screen will have subtle differences. Example: in The Witcher 3, nVidia renders the gradient for shadows so that the effect is 'smooth', whereas on AMD the same shadow will appear 'dithered'. It really depends on the game. In some titles you will see absolutely no difference (usually from smaller studios and indie games), while in AAA titles, which are usually endorsed (sponsored?) by either nVidia or AMD, the differences show up. Going back to The Witcher 3, nVidia handles AA like a champ, while I cannot even override that option in CCC. At the same time, I can go into CCC, override the game's tessellation and use AMD's specific renderers to get next to no performance loss from Hairworks.

So OP is absolutely correct! Some titles look 'worse' on nVidia, and likewise some look 'worse' for AMD.

However, it is all moot unless you have some sort of OCD and spend more time standing still and going over the scene with a fine-tooth comb rather than enjoying an amazing game.

:toast:
 
Nope. The first pair compares the same GPU at different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA.
Watch Dogs is not an AMD title; it was even bundled with NVIDIA cards for a while and makes use of GameWorks. The game displayed is Sleeping Dogs, which is part of AMD's program.

Meh, I am honestly not sure, but I have heard some people say color reproduction is better on AMD. Personally I have never thought about comparing it on the same screen; the only comparison I could make is between my laptop (GTX 675M) and desktop (R9 290X), but those are completely different cards and ages. It could be true, but I have never thought about it.
 
whoever made that first pic is a liar, they must have set in-game AF to disabled then let the driver do nothing on the right while the left is forced AF... that's nonsensical, i have my stuff on default (yes the minor nvidia optimizations are enabled) & things are fine, i always use in-game settings & override only if the game lacks its own AF

second pic may be a bug or time of day or who knows... it's odd that the shopkeeper's light is off

these are very obvious differences but with unclear reasons or settings, not sure how someone can't see them, they're even highlighted

this has nothing to do with the user or what you personally like; this is about potentially higher benchmark numbers in reviews

i also don't see why a 660 matters, if it's the same settings then it should look the same, this is the first time i'm hearing games are supposed to look different just cuz they're 'optimized' for amd (grid?? the only thing that should be different is if you use intel with that special smoke enabled)
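For anyone who wants to check this objectively instead of eyeballing highlighted crops, here's a minimal sketch for diffing two captures taken at the same resolution and settings. It assumes Pillow and NumPy are installed, and the filenames are just placeholders:

```python
# Minimal screenshot-diff sketch: highlight pixels that differ between two
# captures taken at identical in-game settings. Filenames are placeholders.
import numpy as np
from PIL import Image

def diff_screenshots(path_a, path_b, threshold=8):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Screenshots must share the same resolution")

    # Per-pixel absolute difference, collapsed to a single channel.
    delta = np.abs(a - b).max(axis=2)
    mask = delta > threshold  # ignore tiny compression/dither noise

    # Paint differing pixels red on top of the first capture.
    overlay = a.astype(np.uint8)
    overlay[mask] = [255, 0, 0]
    Image.fromarray(overlay).save("diff_overlay.png")

    changed = mask.mean() * 100
    print(f"{changed:.2f}% of pixels differ by more than {threshold} levels")

diff_screenshots("nvidia_default.png", "amd_default.png")
```

It won't tell you *why* two shots differ, but it at least shows where and how much, which beats arguing over jpegs.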

...i didn't quote the people i'm replying to
 
Meh, I am honestly not sure, but I have heard some people say color reproduction is better on AMD. Personally I have never thought about comparing it on the same screen; the only comparison I could make is between my laptop (GTX 675M) and desktop (R9 290X), but those are completely different cards and ages. It could be true, but I have never thought about it.

It has to do with the default color signals.
 
I remember back in the day GPU reviews did include screenshot captures to compare quality. I enjoyed that; I wish it would come back.

I always use "High Quality" over anything lower, not because I have ever noticed anything wrong; I am just like that :P
 
Interesting. Hey, the problem is: what settings do reviewers use in their benches? Default, mostly. And if those default settings offer poorer image quality in order to gain an edge in fps, it's time to question nVidia.

It is reported that for a Titan X in BF4, changing from "let the 3D application decide" to a custom driver override setting that matches the Fury X's quality results in a 7 fps hit; not sure at which resolution though.
http://www.overclock.net/t/1563386/...cn-vs-nvidia-maxwell-and-kepler#post_24126882
 
If you want a serious thread, drop the Sleeping Dogs example. That was a well-known and very game-specific issue that's just going to make people roll their eyes at the OP's premise.
 
no it doesn't/shouldn't, by that i mean it's incredibly obvious when a monitor is running in RGB limited mode over hdmi, i sure hope people that are switching/comparing vendors realize something is horribly wrong when blacks are grey... it's another reason why hdmi & tv standards suck (similarly, a lot of amd users complain about the overscan setting)

to make matters worse, some monitors (like mine) make a mess when hdmi is used, so i have to use limited on the ps3 or any other hdmi device, then adjust my brightness/contrast as best i can

the chatter & screenshots over the years attempt to show small changes in color or details; in the past i've really disliked nvidia's MSAA compared to ati's, and they also had worse AF until ~2008 or 2009

btw that article is out of date, sometime last year nv updated nvcp to give you a dropdown for the RGB mode

EDIT: i don't mean to make the historical context sound so biased, so let's note that FCAT had to be updated since amd had small amounts of noise in the image that confused the color (frame) detection
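For reference, the washed-out blacks people describe come from the standard 16-235 "limited" video range being shown on a display that expects full 0-255 (or vice versa). A tiny sketch of the mapping, purely to illustrate the numbers involved and not how any driver actually implements it:

```python
# Illustration of 8-bit limited-range (16-235) vs full-range (0-255) RGB.
# If the GPU sends limited range but the display expects full range, black
# arrives as code 16 and shows up as dark grey, and white tops out at 235.

def full_to_limited(v):
    """Squeeze a full-range 0-255 value into the 16-235 'video' range."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Expand a limited-range value back to 0-255 (what a correctly
    configured display does before showing it)."""
    return round((v - 16) * 255 / (235 - 16))

for v in (0, 64, 128, 255):
    lim = full_to_limited(v)
    print(f"full {v:3d} -> limited {lim:3d} -> expanded back {limited_to_full(lim):3d}")
```

Run it and you can see black (0) becomes 16; if nothing in the chain expands that back out, that 16 is exactly the grey-black people complain about.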
 

Not so obvious. G-Sync Monitors and Limited Range RGB 16-235 on NVIDIA GPUs

The question was about defaults, and when you're using a lesser color signal you save on bandwidth.

Limited (screenshot: aano8.jpg)

Full (screenshot: fdnQM.jpg)
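If anyone wants to sanity-check their own captures, a rough heuristic (nothing more) is to look at whether the pixel values ever leave the 16-235 band: a scene that clearly contains true blacks and whites but never does is a hint the chain was stuck in limited range. A sketch, again assuming Pillow/NumPy and placeholder filenames:

```python
# Rough heuristic only: a capture whose values never leave 16-235 *may* have
# gone through a limited-range stage (or the scene simply has no true
# black/white), so treat the result as a hint, not proof.
import numpy as np
from PIL import Image

def value_range(path):
    px = np.asarray(Image.open(path).convert("RGB"))
    return int(px.min()), int(px.max())

for shot in ("limited_capture.png", "full_capture.png"):  # placeholder names
    lo, hi = value_range(shot)
    verdict = "possibly limited-range" if lo >= 16 and hi <= 235 else "full-range values present"
    print(f"{shot}: min={lo} max={hi} -> {verdict}")
```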
 
AMD and Nvidia user here; the first thing I do after installing new drivers on every PC I own is max out the default IQ settings in the driver.

On both my AMD and Nvidia systems, barring some brand-specific effects like GameWorks or HBAO+, IQ looks the same.

I remember a time when Tom's Hardware and AnandTech used to compare the differences between both companies in terms of anisotropic filtering and antialiasing techniques, but such differences became so indistinguishable between both teams that the comparisons were just dropped altogether.

I cannot speak for default driver settings IQ, but I can tell you from my own empirical experience that, maxed out, both teams offer the same degree of IQ; chances are default settings won't even matter for benching, as most hardware review sites list the IQ settings used for every game to make the comparisons equal, and such settings usually override whatever the driver IQ is set to.

IMHO, this topic is not even worth discussing; the times when both teams were caught red-handed using specific 'low quality' settings when a given executable was launched, to cheat in a few games or apps, are long behind us.

Not worth wasting W1zzard's (or anyone else's) time on such a trivial topic as far as I'm concerned.
 
It's not a cheat. As I said, going by @Xzibit's link, it doesn't affect DVI. A lot of folks still use DVI.
As the article alluded to, both NV and AMD have issues, NV's being worse, but it is a legacy issue that hopefully gets resolved as DP and HDMI evolve.
It would be good to see how many folks still use DVI.
 
AMD and Nvidia do use different color schemes, as they are based on similar but different technologies... this is fact and is not in question whatsoever. AMD does use a larger color spectrum by default; it makes no difference to most people, and it makes no difference in fps either...

I think people are confusing this with performance... in the end it's only a matter of preference to some people.
 

The screenshots in the original post clearly show a difference in rendering (sharpness/details/lighting) rather than colors, so your message is kinda off the mark.
 

The thread has progressed to a discussion of colour, as this is, courtesy of Xzibit's link, a definite, tangible, inherent issue with colour use over DP or HDMI for Nvidia. As is scaling with AMD.
The screenshot is bogus as an unbiased piece of evidence. Using an ancient title that waaaaay back prompted discussion of vendor-specific FX is shoddy for 2015.
I used 7970s as my last AMD setup a couple of years ago. I moved to a Titan (for single-GPU reasons) and there were no noticeable differences to me.
Of course I enjoyed the tessellated everything that Nvidia crammed in, even in unseen areas (water). That is what goes on nowadays: find out what the competing architecture does worse and shove a tonne of it into the game you help develop.
Years back, AMD did it with Sleeping Dogs and one of the Grid(?) titles. Nvidia do it quite a lot on everything they touch.
 
This all started at wccftech (the conspiracy), why am I not surprised :rolleyes:
After the GTX 970 fiasco every fanboy out there is looking for problems where there aren't any :shadedshu: (especially now after the Fury X got ripped)

:pimp::pimp::pimp:
 