
Microsoft Tech Chief Prefers Using NVIDIA AI GPUs, Keeping Tabs on AMD Alternatives

Not having a choice is not a preference. Preferring to subsidize a supplier's 70% margin is not a positive lol.
It's just a condensed headline plus a short article; you can only do so much within a certain word count. Feel free to read the full interview for more context.
All the same, he's had a frustrating time.
 
The last time this point of view had any validity was in the RX 5000 era. AMD drivers have been flawless since then. But an Nvidia-only user wouldn't know, right?

I have an AMD-based laptop, and even its 5600H's Vega iGPU somehow still manages to give me grief.

Not too long ago they broke the HDMI Audio driver. Instant BSOD upon access. That was fun to troubleshoot.

But just last night we had another totally isolated case here didn't we? Quite flawless, just ask @faye :rolleyes:
 
When Nvidia users use AMD cards, there will be no more driver issues. As only Nvidia users experience AMD driver issues, despite not owning an AMD card.
Uh... yeah no.

I use both myself, and I agree.
Don't ever go HDR on Radeon binary Windows drivers. Just don't. Especially don't turn on OpenGL triple buffering at the same time as using something like SpecialK for HDR uplifts. Why? Who knows, but don't.

I use both myself, and I agree.
To be fair, they (they being the binary Windows drivers) are fine if you don't do anything the vast majority wouldn't do. But that sucks.

That's some Olympics-grade mental gymnastics... Have you even considered that a lot of people use Nvidia only and exclusively because the AMD drivers are a laughing stock?
They aren't on Linux, though, because fanboys tend to get laughed out of Linux land, and the community can actually get healthy and fix things.

The irony is incredible.

When Nvidia users use AMD cards, there will be no more driver issues. As only Nvidia users experience AMD driver issues, despite not owning an AMD card.
Also, I hate to burst this wonderful theory, but I am an AMD user. My purchase chain is even trackable via this forum, friggin lol.
 
But just last night we had another totally isolated case here didn't we? Quite flawless, just ask @faye :rolleyes:
Tbf AMD's just doing what some hipster YT channels do these days: crowdsource their QA/QC to end users, kinda like that guy who was reamed by GN about a month back.

As they say, the early bird gets the worm :laugh:
 
They aren't on Linux, though, because fanboys tend to get laughed out of Linux land, and the community can actually get healthy and fix things.

Yeah, agree. AMD on Linux is actually wonderful. Nvidia is a full-on nightmare. Never got my 3090 to play nice with Linux, and I bet Ada is even worse... :eek:

Tbf AMD's just doing what some hipster YT channels do these days: crowdsource their QA/QC to end users, kinda like that guy who was reamed by GN about a month back.

Oh hey, I'd know a thing or two about that! And I would be proud to be part of it, as I once was. If only they ever listened.
 
Really? I always turn it on. No issues.
Don't try HDR gaming with it. All games will generally cease to launch if OpenGL triple buffering is used in conjunction with an HDR-native surface.

Why? Why even DX12 etc. games? No one knows. It's a mystery that's more likely to get swept under the rug than solved, sadly.

Tbf AMD's just doing what some hipster YT channels do these days: crowdsource their QA/QC to end users, kinda like that guy who was reamed by GN about a month back.
That's horrible, because a lot of end users will play fanboy and, rather than admit there is an issue, deny it until it (doesn't) go away.
 
That's partly true, but you have to have those issues in the first place. The last time I used an AMD GPU, iGPU or dGPU, was almost a decade back, and yeah, that was a horrible experience at the time, because AMD stopped updating the drivers for their TeraScale GPUs. Purely in terms of stability it was no better or worse IMO than Intel or Nvidia, but again, I didn't use it across a wide range of applications or games.
 
Don't try HDR gaming with it. All games will generally cease to launch if OpenGL triple buffering is used in conjunction with an HDR-native surface.
My new monitor is technically HDR400 capable, but I can only switch on HDR via its own menu. Windows doesn't recognise it as HDR for some reason. So I wouldn't know.

I still have 10-bit colour which works with triple buffering without any problems, though.
 
My new monitor is technically HDR400 capable, but I can only switch on HDR via its own menu. Windows doesn't recognise it as HDR for some reason. So I wouldn't know.
It has to be either a) an HDR-native game, or b) something uplifted to HDR (using SpecialK or similar). SDR games being tonemapped via Windows has always been fine.

It's an odd problem and on its own not a big deal... but that's the thing. Go off the beaten path at all and the driver quality differences will become night and day in no time.
 
It has to be either a) an HDR-native game, or b) something uplifted to HDR (using SpecialK or similar). SDR games being tonemapped via Windows has always been fine.
Then I'll probably never see this happen as my monitor isn't even recognised as HDR-capable in Windows.

It's an odd problem and on its own not a big deal... but that's the thing. Go off the beaten path at all and the driver quality differences will become night and day in no time.
You can say that about every brand. For example, my 4K TV won't do full RGB colours with 4:4:4 chroma at 60 Hz with an Nvidia GPU. It works just fine with Intel and AMD, but not with Nvidia.

This is a less obscure problem than AMD's HDR-with-triple-buffering issue (just disable triple buffering, what's the big deal?), but I'm still not bashing Nvidia for it. Blaming a manufacturer for not sorting out a problem that maybe 0.5% of the user base is concerned about is childish.
 
You can say that about every brand.
If I thought that about Nvidia's drivers, I'd not have written what I did.

HDMI 2.1 Atmos dropouts are honestly the only bug I experienced in my time on Nvidia. With AMD I could (and did) make quite a list. I actually had a .txt list of workarounds and mitigations at one point, but as I'm mainly on Linux now, it got sent to the square bin a while back.

AMD does have strengths (a way better OSS driver is one of them), but they are not in the same league at all if you use the binary Windows driver.

For example, my 4K TV won't do full RGB colours with 4:4:4 chroma at 60 Hz with an Nvidia GPU. It works just fine with Intel and AMD, but not with Nvidia.
That sounds like an EDID compliance issue more than a driver issue. Lots of TVs get the EDID wrong, though; that is why CRU exists.
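
For the curious, here's roughly what that compliance check is keyed off. A minimal Python sketch, assuming you've dumped the raw 256-byte EDID to a file (the edid.bin name is just illustrative; CRU can export one, and on Linux it sits under /sys/class/drm/*/edid): it walks the CEA-861 extension block and pulls out the Max TMDS Clock from the HDMI vendor block, the field that tells a strict driver how fast it's allowed to drive the link.

```python
# Minimal sketch: read the sink's advertised "Max TMDS Clock" from a raw EDID
# dump. Offsets follow CEA-861 / the HDMI 1.4 spec; "edid.bin" is a
# placeholder name for wherever you saved the export.

def max_tmds_clock_mhz(edid):
    """Return the advertised Max TMDS Clock in MHz, or None if not present."""
    if len(edid) < 256 or edid[126] == 0:      # byte 126 = extension count
        return None
    ext = edid[128:256]                        # first extension block
    if ext[0] != 0x02:                         # 0x02 = CEA-861 block tag
        return None
    dtd_offset = ext[2]                        # data blocks end here
    i = 4
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        # Tag 3 = Vendor-Specific Data Block; OUI 0x000C03 = HDMI 1.4 VSDB
        if tag == 3 and length >= 7:
            oui = ext[i + 1] | (ext[i + 2] << 8) | (ext[i + 3] << 16)
            if oui == 0x000C03 and ext[i + 7]:
                return ext[i + 7] * 5          # stored in units of 5 MHz
        i += 1 + length
    return None

with open("edid.bin", "rb") as f:
    print(max_tmds_clock_mhz(f.read()), "MHz")
```

If that comes back low (or the byte is missing entirely), a strictly compliant driver will refuse any mode that needs more bandwidth, no matter what the panel can actually do.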

just disable triple buffering, what's the big deal?
It isn't, but the fact that an OpenGL setting impacts anything that is NOT OpenGL is a troubling sign of something being very messy internally.

Of course, AMD is NOT sitting on their ass by any means. Even I will admit their DX11/OGL driver performance gains have been great recently, even if only because they had been neglected for literal decades and someone finally noticed...
 
That sounds like an EDID compliance issue more than a driver issue. Lots of TVs get the EDID wrong, though; that is why CRU exists.
Then why is there no issue with an Intel or AMD GPU?
 
Then why is there no issue with an Intel or AMD GPU?
Most likely they aren't interpreting the EDID in a compliant way, and are just forcing it to run faster despite what's in there being out of spec. That works until it doesn't, and then it's real bad. You usually end up with a corrupted picture or a misbehaving monitor.

Older Nvidia drivers functioned similarly, but they had a big "strict compliance" push recently for whatever reason. It also broke some DisplayPort-to-HDMI adapters before they whitelisted them.

My guess is 4K@60 Hz 8-bit RGB is just borderline enough that you get away with it? Really, they should use standard LCD timings, though.
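
To put numbers on "borderline", a quick back-of-envelope sketch, assuming the standard CTA-861 4K blanking of 4400x2250 total pixels and the 340/600 MHz TMDS ceilings of HDMI 1.4/2.0:

```python
# Back-of-envelope: why 4K60 RGB is over the line for an HDMI 1.4-class sink
# while 4K30 RGB and 4K60 4:2:0 squeak through. Uses the standard CTA-861
# 4K blanking; HDMI 1.4 tops out at a 340 MHz TMDS clock, HDMI 2.0 at 600 MHz.

H_TOTAL, V_TOTAL = 4400, 2250                    # 3840x2160 active + blanking

def tmds_clock_mhz(refresh_hz, bpc=8, yuv420=False):
    pixel_clock = H_TOTAL * V_TOTAL * refresh_hz / 1e6   # MHz
    clock = pixel_clock * bpc / 8                # deep colour scales the clock
    return clock / 2 if yuv420 else clock        # 4:2:0 halves the rate

for label, clk in [
    ("4K60 RGB 8-bit", tmds_clock_mhz(60)),
    ("4K60 4:2:0    ", tmds_clock_mhz(60, yuv420=True)),
    ("4K30 RGB 8-bit", tmds_clock_mhz(30)),
]:
    verdict = "needs HDMI 2.0" if clk > 340 else "fits HDMI 1.4 (<= 340 MHz)"
    print(f"{label}: {clk:.0f} MHz -> {verdict}")
```

4K60 RGB lands at 594 MHz, past the HDMI 1.4 ceiling but within HDMI 2.0; 4K30 RGB and 4K60 4:2:0 both land at 297 MHz, which is exactly why those are the fallbacks a strict driver offers when the EDID advertises a low limit.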
 
Most likely they aren't interpreting the EDID in a compliant way, and are just forcing it to run faster despite what's in there being out of spec. That works until it doesn't, and then it's real bad. You usually end up with a corrupted picture or a misbehaving monitor.
I've been using it like this for years without any issue. There's no reason why it shouldn't work on Nvidia, but it doesn't.

My guess is 4K@60 Hz 8-bit RGB is just borderline enough that you get away with it? Really, they should use standard LCD timings, though.
I don't care what timings they use as long as it works. I don't want my TV to be compliant. I want it to work.

My point is that there are off-the-beaten-path bugs in every driver; it's not specific to AMD. If you don't want any bugs, you shouldn't buy/build a computer.
 
I don't care what timings they use as long as it works. I don't want my TV to be compliant. I want it to work.
You might look at it that way, and I might even agree with you if it wasn't literally a case of violating a spec's set limits. That seldom ends well, and I honestly can't/won't label code that avoids this as "buggy".

PS: if you use CRU, odds are you can get that working on Nvidia, too.
 
You might look at it that way, and I might even agree with you if it wasn't literally a case of violating a spec's set limits. That seldom ends well, and I honestly can't/won't label code that avoids this as "buggy".

PS: if you use CRU, odds are you can get that working on Nvidia, too.
So it's only Nvidia that doesn't violate the spec, and I should be glad for it, even if it gives me ugly, washed-out colours and greyish blacks? I don't see any logic here. :wtf:

Like I said, my experience with AMD and Intel isn't buggy in this regard. Stuff just works that doesn't with Nvidia. That's it.

What's CRU?
 
What's CRU?
Custom Resolution Utility. Load it up and post a screenshot, please; I can guide you through what is going on and almost certainly fix it.
So it's only Nvidia that doesn't violate the spec
They are following it strictly. It's not really something you should be "glad" for, just a design choice. I'm hesitant to call strictly following a spec a bug, but it certainly is a PROBLEM in your instance. Splitting hairs a bit perhaps; I am infamous for that lol.
 
Custom Resolution Utility. Load it up and post a screenshot, please; I can guide you through what is going on and almost certainly fix it.
It's a bit difficult with the missus occupying the TV during the weekend, but I'll try it when I get a chance. :D

They are following it strictly. It's not really something you should be "glad" for, just a design choice. I'm hesitant to call strictly following a spec a bug, but it certainly is a PROBLEM in your instance. Splitting hairs a bit perhaps; I am infamous for that lol.
It is a problem, but it's fine as long as dual GPU works through the Intel iGPU (kind of like in the laptop world). :)

It's just an example to show that, contrary to common belief, even Nvidia's drivers aren't perfect.

Custom Resolution Utility. Load it up and post a screenshot, please; I can guide you through what is going on and almost certainly fix it.
Just done it; it gives me this:
[screenshot: 20231006_151054.jpg]


On the other hand:
[screenshot: 20231006_151557.jpg]


Something doesn't add up.
 
Just done it; it gives me this:
[screenshot: 20231006_151054.jpg]

On the other hand:
[screenshot: 20231006_151557.jpg]

Something doesn't add up.
Yeah, that's all kinds of wonky in the EDID. If you hit the 3840x2160x30.00 entry and hit "edit," you can most likely fix it as follows:

[screenshot: Screenshot_20231006_145352.png]
The dropdown and refresh rate are the important entries. Then OK out of the program and reboot. If it doesn't work, you may need to boot into safe mode to reset the displays; I can walk you through that if needed (it's just the reset.exe contained inside the zip).

It should work if the display really supports HDMI 2.0.
 
Yeah, that's all kinds of wonky in the EDID. It should work if the display really supports HDMI 2.0.
Thanks, but I find it easier to just use an AMD GPU or the Intel iGPU as they work nicely without all this messing around. I can even use an Nvidia GPU for 3D as long as the TV is connected to the iGPU.
 
Thanks, but I find it easier to just use an AMD GPU or the Intel iGPU as they work nicely without all this messing around. I can even use an Nvidia GPU for 3D as long as the TV is connected to the iGPU.
I mean, it's not really that hard to edit, but if that's what you want to do, go for it. I don't mean that sarcastically. If it works, it works.

With that EDID, I'm frankly surprised even AMD works, though. I hate to do this to you, but... are you sure you are rendering in full chroma? If you aren't, colored text will be awful.

Does this test pattern look clear and crisp throughout, at 100% zoom?

[image: chroma-444.png]
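
(If anyone wants to roll their own: a rough sketch with Pillow, below. The trick is alternating single-pixel red/blue columns; they stay distinct at 4:4:4 but average into purple mush at 4:2:2/4:2:0, because neighbouring pixels then share one chroma sample. The chroma-test.png output name is arbitrary.)

```python
# Rough sketch: generate a chroma-subsampling test pattern with Pillow.
# Alternating 1-px red/blue columns stay crisp at 4:4:4 but smear to purple
# under 4:2:2/4:2:0, because adjacent pixels share one chroma sample.
from PIL import Image, ImageDraw

img = Image.new("RGB", (512, 128), "black")
draw = ImageDraw.Draw(img)
for x in range(0, img.width, 2):
    draw.line([(x, 0), (x, img.height - 1)], fill=(255, 0, 0))          # red
    draw.line([(x + 1, 0), (x + 1, img.height - 1)], fill=(0, 0, 255))  # blue
img.save("chroma-test.png")  # view at 100% zoom, no scaling
```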
 