
AMD "Strix Halo" Zen 5 Mobile Processor Pictured: Chiplet-based, Uses 256-bit LPDDR5X

Thanks for invalidating a metric butt-ton of otherwise valid experiences, mine included.


Yeah, some maybe. Not all. And that's all it takes.

*Points at build*
So wait, your 7900XTX Linux build is proof that AMD's Windows drivers are crappy?
 
Typically, AMD pairs very few GPU CUs with its high-core-count SKUs, as it's assumed they'll be coupled with a discrete GPU.
Considering that a homogeneous SoC should be more power- and cost-efficient than a CPU and GPU connected through a PCIe bus, each with its own memory, I wonder why AMD didn't do this sooner.
My wild guess was that developing such chips could put new PCs closer to the performance/cost of consoles, so Sony and/or Microsoft got AMD to sign a clause preventing it from releasing a high-performance SoC for X number of years. Akin to Samsung's contract for the Xclipse RDNA GPUs having a clause that prevents AMD from releasing SoCs that operate below 5 W.



The i7-8809G doesn't count? It even had a nugget of HBM on the chip

There were so many things wrong with Kaby Lake G.
  • It released in 2018 using the old 4-core Kaby Lake CPU from early 2017, after Intel had already released 6-core Coffee Lake mobile CPUs.
  • They called it a Vega GPU when in reality it used the ISA of a Polaris GPU, so no improved geometry processing, no Rapid Packed Math, etc.
  • Intel charged way too much for it, to the point that it was much cheaper to get a laptop with a Core i5 + GTX 1050, which also delivered better overall performance.
  • The very large memory bandwidth (200 GB/s) couldn't be taken advantage of with only 4 GB of HBM available and such a small 24-CU GPU at ~1 GHz.

In the end it was just bad design and planning. They launched a premium product with old silicon.


The bad AMD driver quality misinformation is an internet myth perpetuated by bad actors.

Not just bad actors. I have a friend who perpetuates that myth because he had a bad experience with an AMD GPU... in 2007.
 
So wait, your 7900XTX Linux build is proof that AMD's Windows drivers are crappy?
I just switched to Linux this weekend because the open-source drivers are superior to the Windows ones, yes.

I was dual booting until literally yesterday.

And I wouldn't use the word "crap." But there is a difference. It's been improving, yes, but I am impatient. :laugh:
 
Thanks for invalidating a metric butt-ton of otherwise valid experiences, mine included.
IMO, the thing with the "bad AMD driver myth" is not that it's a myth, but that it's old. I had a buttload of problems with the driver on the 5700 XT, but ever since RDNA 2, my experience has been rock solid.
 
OK, an APU with an iGPU that could be comparable to the 4060M and 4070M, and it still fricking has 8 lanes of PCIe 5.0 for a discrete GPU. Just wtf.
 
IMO, the thing with the "bad AMD driver myth" is not that it's a myth, but that it's old. I had a buttload of problems with the driver on the 5700 XT, but ever since RDNA 2, my experience has been rock solid.
I mean, for the "crap" version of it I agree 100%. AMD drivers ceased to be "crap" long ago. But I still feel there's a ton of work to be done on things like CPU overhead. Maybe it's just me, but that's my experience anyway.
 
Even with that, the hit is massive; generally speaking, Linux builds score higher across the board in CPU benchmarks when comparing the same hardware.
Linux is indeed faster for quite a few workloads. However, the article doesn't say if virtualization-based security (VBS) was enabled for Windows 11.
Security details:
  • Windows 11: __user pointer sanitization: Disabled; Retpoline: Full; IBPB: Always; IBRS: Enabled; STIBP: Enabled; VBS: Disabled
  • Ubuntu 23.10: gather_data_sampling, itlb_multihit, l1tf, mds, meltdown, mmio_stale_data, retbleed, srbds, tsx_async_abort: Not affected; spec_rstack_overflow: Vulnerable (Safe RET, no microcode); spec_store_bypass: Mitigation of SSB disabled via prctl; spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization; spectre_v2: Mitigation of Enhanced/Automatic IBRS, IBPB: conditional, STIBP: always-on, RSB filling, PBRSB-eIBRS: Not affected
  • Ubuntu 24.04: same as 23.10, plus reg_file_data_sampling: Not affected
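For what it's worth, the Linux entries in that list come straight from sysfs, so anyone can dump them on their own machine. A minimal sketch, assuming a kernel new enough to expose the vulnerabilities directory (roughly 4.15+):

```python
#!/usr/bin/env python3
# Print the kernel's per-vulnerability mitigation status -- the same data
# the benchmark article lists. Linux-only: this directory is where the
# kernel publishes it (the set of entries varies by kernel version and CPU).
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(VULN_DIR.iterdir()):
    print(f"{entry.name}: {entry.read_text().strip()}")
```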

I've always suspected a lot of that is down to the number of useless services always enabled/required on Windows. But in recent years there are also generally better schedulers available on Linux, which helps a lot, especially post-Android.
 
I mean, for the "crap" version of it I agree 100%. AMD drivers ceased to be "crap" long ago. But I still feel there's a ton of work to be done on things like CPU overhead. Maybe it's just me, but that's my experience anyway.
I thought Nvidia drivers have been proven to have a lot more CPU overhead. Anyway, I don't have this problem.

At this point, my only problem is the unreasonably high video playback power consumption on RDNA 3, but I don't think that can be improved with drivers, unfortunately.
 
One thing of note: with a 256-bit memory bus and the iGPU inactive, the CPU may have access to much higher memory bandwidth than desktop processors get. I wonder if there will be cases where Strix Halo can outperform the desktop Granite Ridge.
In theory, but that much bandwidth will be unnecessary for CPU-only, at least at a 16C ceiling. You might see a boost in benches, but everything else would feel the same.
 
I thought Nvidia drivers have been proven to have a lot more CPU overhead.
I should clarify: it's nuanced. They do, but only in DX12 and Vulkan. AMD has always struggled more with overhead in DX11 and OpenGL, but it's getting better.

I play mostly indie titles so...
 
IMO, the thing with the "bad AMD driver myth" is not that it's a myth, but that it's old. I had a buttload of problems with the driver on the 5700 XT, but ever since RDNA 2, my experience has been rock solid.

This is rock solid?

I just had a look... with the monitor off, board power is around 8-9 W, which is awesome. Now, with the monitor back on, VRAM clock jumps straight to 909 MHz, and power consumption to 45 W. It's funny that it consumes less power as I'm typing this comment in Chrome than it does doing nothing on the Windows desktop, as Chrome lets the VRAM clock fluctuate a bit, while it's always 909 MHz on the desktop.

It was fine for a long time. I really don't understand what happened with these 24.x.x driver versions. I also don't get why reverting to an older version doesn't work. :(

Update:

I had enough of the issue, and put my 6500 XT into the system. It was fine for a while, but then I noticed that the VRAM clock was stuck at 1057 MHz during idle. I don't know by how much that bumped the idle power up, as RDNA 2 only reports GPU chip power draw, not the total board power. So then, I thought, whatever, I'll just live with this, and put the 7800 XT back. And now, it's fine(-ish). VRAM clock goes back into the 90-100 MHz range at idle, and power is at 24 W. What... the... hell? :wtf:
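For anyone who wants to log this instead of eyeballing it, here's a minimal polling sketch against the amdgpu sysfs interface (Linux-only, so it won't help on the Windows side; the card index and hwmon layout are assumptions, check your own /sys/class/drm):

```python
#!/usr/bin/env python3
# Poll amdgpu sysfs for the current VRAM clock state and power reading.
# Assumes the GPU is card0; hwmon naming varies, hence the glob. Whether
# power1_average means chip-only or total board power depends on the GPU
# generation (RDNA 2 reports chip power only), and on some kernels the
# file is power1_input instead. Ctrl-C to stop.
import glob
import time

MCLK = "/sys/class/drm/card0/device/pp_dpm_mclk"
POWER = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")

while True:
    # pp_dpm_mclk lists memory clock states; the active one ends with '*'.
    with open(MCLK) as f:
        active = [line.strip() for line in f if line.strip().endswith("*")]
    watts = f"{int(open(POWER[0]).read()) / 1e6:.1f} W" if POWER else "n/a"
    print(f"mclk: {active[0] if active else '?'}   power: {watts}")
    time.sleep(2)
```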

I guess you just have a different definition. :shrug:

My issue is that as long as people won't admit there are problems, AMD has no incentive to fix their Windows drivers.
 
I thought Nvidia drivers have been proven to have a lot more CPU overhead. Anyway, I don't have this problem.

At this point, my only problem is the unreasonably high video playback power consumption on RDNA 3, but I don't think that can be improved with drivers, unfortunately.
Sometimes the high power draw is caused by running the VRAM at full speed. If that's the case, it can certainly be addressed by a driver update. The only thing is, AMD has fixed power draw many times before, only to regress it a few driver releases down the road. Nvidia isn't safe from this either, but AMD seems to regress more often.
 
IMO, the thing with the "bad AMD driver myth" is not that it's a myth, but that it's old. I had a buttload of problems with the driver on the 5700 XT, but ever since RDNA 2, my experience has been rock solid.
Well, I've been running into several ERROR_GFX_STATE messages in RDR2 now. Strangely, only from chapter 4 onwards. It's not frequent enough to care about, and it's not tied to a specific sequence in the game.

That said, Nvidia wasn't trouble-free either, but these messages need to not happen too often, or it's gonna be a short ride on RDNA3. If big titles like this can't run entirely stable, meh. Though it's known that Rockstar didn't give RDR2 on PC much aftercare either. Benefit of the doubt.
 
After ~20 years of AMD dreaming about project Fusion, maybe it will finally become reality?
But I can imagine they'll fck up the price, and the energy management of running that 256-bit-wide memory bus under light load...
 
In theory, but that much bandwidth will be unnecessary for CPU-only, at least at a 16C ceiling. You might see a boost in benches, but everything else would feel the same.
Moreover, if it is anything like Phoenix, the CPU complex might not have a wide enough link to the memory controller to use that bandwidth. The cores themselves are capable of consuming it, but I doubt they'd be allowed to access even 50% of it.
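That part will be easy to check once hardware ships: even a rough single-threaded probe is enough to compare CPU-reachable bandwidth between two machines (numpy here is just a convenient assumption; a proper multi-threaded STREAM run would get closer to the real ceiling):

```python
#!/usr/bin/env python3
# Rough, single-threaded memory-bandwidth probe. It understates a machine's
# peak (one core, and write-allocate traffic isn't counted), but the relative
# numbers between two systems are still telling.
import time
import numpy as np

N = 100_000_000              # three float64 arrays, ~0.8 GB each
a = np.ones(N)
b = np.ones(N)
c = np.zeros(N)

np.add(a, b, out=c)          # warm-up: fault in every page before timing

t0 = time.perf_counter()
np.add(a, b, out=c)          # reads a and b, writes c: three arrays touched
dt = time.perf_counter() - t0

print(f"~{3 * N * 8 / dt / 1e9:.1f} GB/s effective")
```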
 
I've always wondered about this statement. I ran Nvidia cards for years until they pulled that GeForce Partner Program, when I switched to AMD (as much for the price/performance ratio as anything; I can't afford $1,500-2K for a GPU). I can't think of a single bug that was really a showstopper with either of them. The most annoying problem I ever had was the power draw with dual monitors, and eventually that got fixed.
There are enough anecdotes in either direction. I could go on about flickering, mouse-cursor corruption, hardware-acceleration incompatibilities, and the need to run certain drivers with certain games over the multiple chances I gave Radeon cards. Ironically, they worked better in games than out of them, which you'd think would be the thing to get right first. It doesn't matter much now though, since even if the drivers were bug-free today, sadly Nvidia is the only choice for the AI stuff I do.
Sometimes I wonder what it would have been like if AMD and Nvidia had merged, as was the original plan before AMD settled for ATI when Jensen wanted more control than AMD was willing to give.
 
Linux is indeed faster for quite a few workloads. However, the article doesn't say if virtualization-based security (VBS) was enabled for Windows 11.
I'd argue that if Linux is not faster than Windows 11 by now, then Linux is a lost cause on the desktop. But the lack of software compatibility, and the deep knowledge of Linux's inner workings needed just to install a small app, let alone a driver, is what kills Linux for the general user, and always will. Linux can't break out of its niche because of the people who make it. I seriously wouldn't know what to do with my computer beyond internet browsing if I booted into Linux.
 
Sometimes I wonder what it would have been like if AMD and Nvidia had merged, as was the original plan before AMD settled for ATI when Jensen wanted more control than AMD was willing to give.
Intel would've released an 8-core SB then & killed both of them in one fell swoop :nutkick:
 
Sometimes I wonder what it would have been like if AMD and Nvidia had merged, as was the original plan before AMD settled for ATI when Jensen wanted more control than AMD was willing to give.
3DFX ring a bell? They were better than nGreedia, and were killed because of it.
 
I'd argue that if Linux is not faster than Windows 11 by now, then Linux is a lost cause on the desktop. But the lack of software compatibility, and the deep knowledge of Linux's inner workings needed just to install a small app, let alone a driver, is what kills Linux for the general user, and always will. Linux can't break out of its niche because of the people who make it. I seriously wouldn't know what to do with my computer beyond internet browsing if I booted into Linux.
At this point of dominance, Windows market share can only go down; the only questions are to what floor and at what speed. But honestly, the web has taken over, with so many things being web apps or Electron apps these days, that if you only wanted to use a browser on Linux, that covers most people's activities now anyway, which is great for choosing the right OS.

Somehow Valve managed to make the Steam Deck a success despite putting Arch, of all distros, on it, so it's not as bad as you'd think!

As a developer though, I noticed a funny thing about Windows... the I/O is such that if a program is ported naively, it may work, but it will perform worse than on Linux. "stat" is a fast call on Linux, but not so on Windows. I just don't do any JS development on Windows anymore, for example, because of how atrociously tools like npm and webpack handle thousands of tiny files. And text-searching them is no better.
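That gap is easy to demonstrate with a small timing sketch you can point at the same directory tree (a node_modules folder is the natural victim) on both OSes:

```python
#!/usr/bin/env python3
# Time stat() over every file in a tree -- the metadata-heavy pattern that
# npm/webpack hammer constantly. Run it on Windows and Linux against the
# same tree to see the per-call difference.
import os
import sys
import time

root = sys.argv[1] if len(sys.argv) > 1 else "."

paths = []
for dirpath, _, filenames in os.walk(root):
    paths.extend(os.path.join(dirpath, f) for f in filenames)

t0 = time.perf_counter()
for p in paths:
    os.stat(p)
dt = time.perf_counter() - t0

print(f"{len(paths)} files, {dt:.2f} s, {dt / max(len(paths), 1) * 1e6:.1f} µs per stat")
```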
 