
Don't forget to turn off the onboard GPU on your Ryzen 7000X3D CPU

I've done a little reading, and it turns out those of us with the little screens that run on HDMI would be fine to run AIDA64 through it for all temps and details.

Apart from that, it can simply be used to avoid driver conflicts when setting up a second screen.

I'm not using either, and have just disabled the GPU in the BIOS, but it wasn't easy.

Here are the screenshots you need to find the option on X670E chipsets. I don't know why, but I could not find the GPU option for about an hour. Hope this saves others the time on ASUS boards.

It's in the Northbridge settings.



[Screenshots: BIOS pages showing where the iGPU option sits under the Northbridge settings]

:toast:
 
I typically opt for a cute Turing Smart Screen over USB-C for those stats. If I need a second screen, it's going to be Miracast to my Surface. Is the 7000 series iGPU troublesome when messing with CPU clock speed? How is the iGPU encoder? I can think of a few reasons people may want to leave it on or off; best left to discretion.
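(For anyone who wants to roll their own feed for one of those little stats screens, here's a minimal sketch of the polling side, assuming an Nvidia card and the nvidia-ml-py bindings; the device index and the one-second interval are arbitrary choices, not anyone's actual setup:)

```python
# Sketch: poll basic GPU stats the way a little stats screen would be fed.
# Assumes an Nvidia card and `pip install nvidia-ml-py`.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"{temp} C | {power_w:.0f} W | {load}% load")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```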
 
The main issue is it requires the standard Radeon drivers. Having both Nvidia and Radeon drivers installed at the same time sucks.
 
I'm actually of the opposite opinion. Using the iGPU on my 7800X3D for my secondary screen lowers idle power consumption on my 7800 XT drastically. Besides, the CPU never comes anywhere close to its power target, so iGPU power consumption isn't an issue either.

Edit: Ah, I see you've mentioned using it for small screens. It's too early in the morning for me to read properly, I guess.
Nevermind. :toast:
 
I had blue screens because of my 7800X3D iGPU drivers, so I just disabled the iGPU and now everything is fine.
 
That's not normal. Maybe a conflict between the early AMD and Nvidia drivers?
 
I need that IGP for my two monitors.
Having a 1440p 280 Hz monitor together with anything else is already enough on all of my RDNA3 GPUs to trigger the max VRAM clock and almost 90 W of idle power consumption.
Putting the second monitor on the IGP fixes it, and the GPU runs at ~6.5 W at idle.
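(On Linux, that stuck-at-max VRAM clock is easy to confirm from the amdgpu sysfs interface. A minimal sketch, assuming the amdgpu driver and that the Radeon card is card0; the path can differ per system:)

```python
# Sketch: check whether the VRAM clock is pinned at its top DPM state.
# Assumes the amdgpu driver and that the Radeon card is card0.
from pathlib import Path

states = Path("/sys/class/drm/card0/device/pp_dpm_mclk").read_text().splitlines()
for line in states:
    print(line)  # the active memory state is marked with '*'

# If the last (highest) state carries the '*', the VRAM clock is pinned at
# max: the multi-monitor symptom described above.
if states and states[-1].strip().endswith("*"):
    print("VRAM clock is sitting at the highest state.")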
 
I typically opt for a cute Turing Smart Screen over USB-C for those stats. If I need a second screen, it's going to be Miracast to my Surface. Is the 7000 series iGPU troublesome when messing with CPU clock speed? How is the iGPU encoder? I can think of a few reasons people may want to leave it on or off; best left to discretion.
I basically don't use it; I didn't even look at the specs for its performance capabilities. This is from the AMD site:
Graphics Model: AMD Radeon™ Graphics
Graphics Core Count: 2
Graphics Frequency: 2200 MHz

I basically have no use for it. The CPU behaves the same without the onboard GPU activated, but I noticed it was always consuming power, so I deactivated it.
Plus I use an Nvidia card, so I don't really want to introduce any weird driver behaviour.
AMD should make it so that this GPU can add to the graphics horsepower of an AMD-based GPU. I think they missed a chance at creating a larger customer base for their GPUs there...
The main issue is it requires the standard Radeon drivers. Having both Nvidia and Radeon drivers installed at the same time sucks.
exactly...
I need that IGP for my two monitors.
Having a 1440p 280 Hz monitor together with anything else is already enough on all of my RDNA3 GPUs to trigger the max VRAM clock and almost 90 W of idle power consumption.
Putting the second monitor on the IGP fixes it, and the GPU runs at ~6.5 W at idle.
Being able to use this GPU for a secondary monitor is a great idea from AMD. I still think they should have found a way to help the GPU add to the graphics horsepower of the PC.

:toast:
 
It's a basic GPU, not a full performance-oriented G-series APU. You're missing the whole point of it: GPU RMAs and troubleshooting, if not the extra display output.

Driver issues between brands aren't as big as people like to make them out to be. Yes, they can cause issues, but that's rare.

And no, "adding to the graphics horsepower": these aren't the mixed-CrossFire days. Either send the YouTube window to the iGPU output or don't bother. Other than encode/decode, there is no adding to a dedicated card.
 
I was not aware of the AMD+Nvidia device driver conflict. That's nuts. I haven't had an Nvidia card since the TNT2 days, but I recognize the Linux community's constant gripes about Nvidia-specific drivers, and this seems to be the kind of scratch that starts it off.
I basically have no use for it. The CPU behaves the same without the onboard GPU activated, but I noticed it was always consuming power, so I deactivated it.
That's really strange. It shouldn't behave like that.
AMD should make it so that this GPU can add to the graphics horsepower of an AMD-based GPU. I think they missed a chance at creating a larger customer base for their GPUs there...
No no, that's not how that works. When you have systems that get long in the tooth after a few years, or you're just looking to increase the performance of an existing workstation or server, you're not typically looking for a CPU upgrade unless you started with THE most basic SKU and lost the silicon lottery on the spot. Typically an add-in card like a compute unit is going to have no video outputs, but it can be configured to do the heavy lifting and push frames out through whatever integrated display-out you choose, like it's a whole new device. The presence of this technology pattern obsoleted an entire era of onboard-video eWaste, and we're all the better for it. I'm thankful this strategy exists, or we would be sitting on way more junk than I care to admit. Most of these forums would be plagued with driver questions, I'd be sitting in a Lain room again, plus it's hilarious that I can just add 8-48 GB of VRAM to a decade-old Intel system with whatever hits the auction junk pile and it runs like a new workstation with zero consequence.
 
I get that. I thought this would be good for chat when streaming. I suppose you will save on resources.
:toast:
 
I thought this would be good for chat when streaming.
I've been thinking about that a lot lately. I'm always picking very LARGE cards that aren't Sapphire or Gigabyte. I've been on a PowerColor kick for the past decade and have an HMD that uses HDMI instead of DP or USB-C. Plus, I haven't been building systems with CPUs that feature an iGPU, because that directly conflicts with my hardware purity + performance-floor philosophy. Basically, if I'm going to build something, the components need to be good at what they're doing as dedicated utility, instead of sharing resources and stretching memory across this and that. Also, I should be choosing components no worse than my current best, as a sort of hardware or feature progression (this is the part that is actually difficult).

We don't really live in that kind of world anymore, and I've had a hard time deciding on a capstone for this AM4 system. I could always pick a 5950X or 5800X3D and just be done with it, but when it comes to having options for streaming, I really should have more than one monitor, and that means having more than one HDMI out. Converters are always available for DP-to-HDMI, but they're usually terrible, just like the mess of bad USB and HDMI cables flooding the market. It just seems like a situation where I would be better off picking an 8" TFT display and feeding it with an integrated option from a 5700G or something.

It sounds like an excellent plan: moving up two cores/four threads, moving up a generation in performance, getting a huge clock-speed boost, retaining the same thermal profile, and gaining integrated HDMI+DP, for what would essentially put me out only ~$70. Is that a good idea? No. The better answer is to explore a realm bigger than 1080p144 with a newer display that is already configured to accept DP.
 
That's not normal. Maybe a conflict between the early AMD and Nvidia drivers?
I had a problem as well between my 4750G and my 4060 (low profile), although I don't recall how I fixed it.
Being able to use this GPU for a secondary monitor is a great idea from AMD. I still think they should have found a way to help the GPU add to the graphics horsepower of the PC.
Windows 10 has a setting to choose which GPU applications use by default, and you can override it per application. Theoretically, if you have two monitors, you can run one game on the iGPU and another on the dGPU, or game on one and set up streaming on the other, etc...
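(As a side note for the scripting-inclined: that Graphics Settings page stores its per-app choice in the registry under HKCU, so a sketch like the one below can set or list preferences in bulk. The game path is a made-up example; "GpuPreference=1;" means power saving, usually the iGPU, and "2;" means high performance, usually the dGPU:)

```python
# Sketch: read/write the per-app GPU preference that the Windows
# Graphics Settings page stores under HKCU. The exe path is hypothetical.
import winreg

KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"
exe = r"C:\Games\SomeGame\game.exe"  # hypothetical example path

# Assign this app to the power-saving GPU (1); use "GpuPreference=2;"
# for the high-performance GPU instead.
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, "GpuPreference=1;")

# List everything currently configured.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    i = 0
    while True:
        try:
            name, value, _ = winreg.EnumValue(key, i)
            print(name, "->", value)
            i += 1
        except OSError:  # raised once we run out of values
            break
```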
 
If you have any experience with the BIOS, then it's easy to find and disable.

But honestly, they should have made a version of the 7800X3D without the iGPU... it's a waste of die space on a CPU that is 100% guaranteed to be used with a dedicated GPU.
 
The last Intel part that I ran had an iGPU, but I disabled it in the BIOS because I wouldn't use it. I would imagine AMD users would do the same thing if they are hunting for MHz and lower temperatures.
 
@Toothless
Anyone here doing basic troubleshooting should have a cheap spare card anyway: not even $30, with HDMI and DVI.

And that's ignoring that the next CPU (and so on) might not even come with an iGPU, or that enabling it would require a BIOS reset, which I couldn't have done on my last Gigabyte X570, as it wouldn't run the G.Skill RAM past JEDEC unless I used a different kit (Corsair) to pre-set clocks/timings/voltage before shutting down and swapping kits.

Or giving up the space for two monitor connections I will not use (short of trouble) that could instead be used for multiple USB ports (which I and others are far more likely to use).

I will never limit myself or others to a CPU that has an iGPU just to avoid spending ~$30 on a dGPU, especially with folks that have spent $600+ on CPU/motherboard/RAM.
 
Windows 10 has a setting to choose which GPU applications use by default, and you can override it per application. Theoretically, if you have two monitors, you can run one game on the iGPU and another on the dGPU, or game on one and set up streaming on the other, etc...
Where is that setting? I'd love to crunch on my iGPU while gaming on the dedicated one.
 
Where is that setting? I'd love to crunch on my iGPU while gaming on the dedicated one.
Firstly, in order to get this to work, I had to set my 4750G iGPU as the default graphics adapter in the UEFI/BIOS. This fixes the driver problem with the iGPU in Device Manager.

Check Device Manager first to make sure neither adapter has a conflict (yellow triangle).
Then go to Graphics Settings, add your apps, and configure each one.

I forget at the moment where the global setting for the system's default adapter is, but when I Google it, it seems you can set your default adapter in the Nvidia Control Panel.
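(The Device Manager check can be scripted, too. A small sketch, assuming the third-party Python wmi package; a non-zero ConfigManagerErrorCode is what Device Manager renders as the yellow triangle:)

```python
# Sketch: list display adapters and flag driver conflicts.
# Assumes Windows and the third-party "wmi" package (pip install wmi).
import wmi

for gpu in wmi.WMI().Win32_VideoController():
    code = gpu.ConfigManagerErrorCode  # 0 means "working properly"
    flag = "OK" if code == 0 else f"conflict (error code {code})"
    print(f"{gpu.Name}: driver {gpu.DriverVersion}, {flag}")
```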

[Screenshot: setting the default graphics adapter]


Here is an example; you should be able to confirm from GPU activity whether your settings were successful.
Below, on the left is when I set my game to the 4060; on the right is when I set my game to the iGPU.

An interesting observation I made: the iGPU's output can go out over the Nvidia card's display output, and vice versa.
If your application or game is displayed on the screen attached to the other adapter, it will incur some activity on both the GPU and the iGPU.
If you play the app or game on the monitor that matches the GPU or iGPU you are using, then the other adapter's activity will drop to zero.
So, to optimize your graphics usage, put the app or game onto the monitor that its adapter is connected to.

In the example below, I am using the 4060 (left) on the same monitor that is connected to it, so the iGPU activity is zero.
On the right, I am using the iGPU on the monitor connected to the 4060, so the 4060 is still doing something: displaying the video output from the iGPU.

[Screenshots: GPU activity comparison for the two monitor/adapter configurations]
 
Using an iGPU can save VRAM, as iGPUs typically use system RAM. They're easily fast enough for desktop graphics acceleration, so I prefer to keep my iGPU on now.

I also think NVENC is nowhere near as good as it's hyped up to be; Intel's encoding (and decoding) performance is much better.

I don't really have experience with AMD iGPU encoding performance, though. But it wouldn't surprise me if that's also better than NVENC.
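(Encoder throughput, at least, is easy to measure yourself. A rough sketch, assuming ffmpeg is on PATH and built with the relevant hardware encoders, and that input.mp4 is a placeholder clip; this times speed only and says nothing about quality:)

```python
# Sketch: time a transcode per hardware encoder as a crude speed comparison.
# Assumes ffmpeg on PATH; encoders absent on this machine simply fail.
import subprocess
import time

ENCODERS = ["h264_nvenc", "h264_qsv", "h264_amf", "libx264"]  # Nvidia, Intel, AMD, CPU

for enc in ENCODERS:
    start = time.time()
    result = subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", enc, "-f", "null", "-"],
        capture_output=True,  # discard the encoded output, keep only timing
    )
    if result.returncode == 0:
        print(f"{enc}: {time.time() - start:.1f} s")
    else:
        print(f"{enc}: not available on this machine")
```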
 
@chrcoluk
It has maybe gotten a bit better lately, probably with low-res stuff, but at least for the FHD/UHD work I've done over the last few years (clips/movie scenes for work, drone videos) in common software, the (Nvidia) dGPU still performs better.
 
I just set up Chrome and VLC to run on the iGPU, but they still load my 7800 XT for some reason. :(
 
Did you see my note about monitors? If the app is on the opposite monitor from its assigned GPU/iGPU, you will get load on both. Try one app at a time, and move each one to the monitor the iGPU is connected to.
 
You know, I've wanted to make those Turing screens for over 10 years, since the 5-inch HD LCD displays became available. I ended up getting wound up in politics and never got around to buying the 3D printer to manufacture the cases.
I missed a beat with this one. I've just bought the 5-inch. Thanks for the heads up!!!!

:toast:
 
I know the 8000G APUs support AFMF; does the 7000 series IGP support AFMF (yet)?

If it (eventually) does, it'd be quite handy to leave the IGP enabled for us 'budget gamers' still on older 'beefy GPUs'.
Windows 11 makes it easy to tell which GPU renders; you'd just be limited to the output spec of the motherboard.

You'd think 'frame pass-through over PCIe' would hurt performance, but I recently had an RX 580X perform (reliably) WORSE in Avatar: Frontiers of Pandora when displaying directly off that card vs. passing through to the 6500 XT (over the X570 chipset, too).
I know that makes zero sense; I re-ran the tests several times and reinstalled drivers (cleanly) twice.
 