
AM4 and Combining Different RAM?

Since I just did a clean install and hadn't run any benches yet: 1080p 3DMark11.

          3200    3600
Results   14602   14725
          13763   13857
          25262   26261
          15638   15697
FPS       74      75
          70      70
          70      71
          40      41
          80      83
          73      73

While it's not a huge difference, it's there, and that's on a card slower than a 3060 Ti, without even using things like ReBAR, where having 5-7 GB/s less bandwidth is bound to matter.
Heck, just looking at kits: 3600C18 runs at least at the same level as 3200C14, while usually being the same price or cheaper, and it's definitely more common to find in shops than 3200C14.

That's part of why I always stick to RAM that allows 1:1 with the bus clock, and the lowest timings the buyer can afford, unless they don't just game and can actually name two things they do that are known to use more RAM (than they need for gaming).
1 (one!) FPS on average? That's well within margin of error.
 
Two possible reasons:
The apps don't benefit from a jump to AM5-era CPUs+memory, and there are no foreseeable updates that would make one reconsider.
The preferred offerings on AM5 boards still aren't financially sound or perpetually MIA/OoS.

I'm still on AM4 and staying here for a LOOOONG time. There's maybe one or two things where AM5 would be a benefit but I won't see it for years anyway.
It's a pattern and there are others that can see past the pattern as well. Making decisions like this is perfectly fine. Not everybody needs the new thing.
It was the same argument from DDR1 to 2, 2 to 3, 3 to DDR4; not the same speech from DDR4 to DDR5.

Sure there's an overlap there.

But in the meantime... save the money. I didn't say go spend money right this second or anything...
 
It was the same argument from DDR1 to 2, 2 to 3, 3 to DDR4; not the same speech from DDR4 to DDR5.
I've been through this noise since EDO was still relevant and started when making the jump from SDR to DDR.
These were all their own standards on each platform. I skipped RAMBUS and DDR2.
Went from the worst DDR kit to the worst DDR3 kit. Needed experience with Crucial and G.Skill to get out.

Ryzen and DDR4 appealed to me during that golden era of Micron E-die and I don't regret choosing it.
Can you say the same about any DDR5 kits? Didn't think so. I think many of us are still waiting for the moment.
 
It was the same argument from DDR1 to 2, 2 to 3, 3 to DDR4; not the same speech from DDR4 to DDR5.

Sure there's an overlap there.

But in the meantime... save the money. I didn't say go spend money right this second or anything...
As far as I've seen, the old standard usually becomes obsolete when the one after the new one gets released. Basically, DDR2 got obsolete when DDR4 dropped, so DDR4 will probably get old when DDR6 is out.

I've been through this noise since EDO was still relevant and started when making the jump from SDR to DDR.
These were all their own standards on each platform. I skipped RAMBUS and DDR2.
Went from the worst DDR kit to the worst DDR3 kit. Needed experience with Crucial and G.Skill to get out.

Ryzen and DDR4 appealed to me during that golden era of Micron E-die and I don't regret choosing it.
Can you say the same about any DDR5 kits? Didn't think so. I think many of us are still waiting for the moment.
I'm fine with DDR5. I wouldn't say it's a night-and-day difference, especially with an X3D CPU that just doesn't care about RAM speed, but it's nice anyway. I wouldn't recommend ditching a Zen 3 platform for it, though.
 
@AusWolf
I repeated it 3 times each, and that's ignoring that the 3200 runs were technically better off (0.3 MHz higher CPU base clock) and started with a colder loop.
And while the gain is small, it's not nothing, and that's on an older GPU and an old bench, not accounting for things like ReBAR, where bus/RAM speed will make a much bigger impact.

In short, for almost all gaming, having the highest clocks that allow 1:1 (RAM/bus) is better than more RAM (above 16 GB).
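For anyone following along, the 1:1 (RAM/bus) ratio mentioned here refers to the Infinity Fabric clock matching the memory clock, and on DDR the memory clock is half the transfer rate. A quick sketch (the helper name is mine, purely illustrative):

```python
def fclk_for_1to1(transfer_rate_mts: int) -> int:
    """FCLK (MHz) needed to run 1:1 with a given DDR transfer rate (MT/s).

    DDR moves two transfers per memory clock, so MCLK = MT/s / 2, and a
    coupled (1:1) setup on Zen 2/3 runs Infinity Fabric at that same speed.
    """
    return transfer_rate_mts // 2

# DDR4-3200 -> 1600 MHz FCLK, DDR4-3600 -> 1800 MHz FCLK
print(fclk_for_1to1(3200), fclk_for_1to1(3600))
```

This is why DDR4-3600 is the practical ceiling for most Zen 3 chips: 1800 MHz is roughly where FCLK stops being stable at stock voltages.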
 
@AusWolf
I repeated it 3 times each, and that's ignoring that the 3200 runs were technically better off (0.3 MHz higher CPU base clock) and started with a colder loop.
And while the gain is small, it's not nothing, and that's on an older GPU and an old bench, not accounting for things like ReBAR, where bus/RAM speed will make a much bigger impact.

In short, for almost all gaming, having the highest clocks that allow 1:1 (RAM/bus) is better than more RAM (above 16 GB).
Even if that 1 FPS difference is consistent across several runs, it's still only 1 FPS. Don't tell me you feel that in a live game.
 
I've been through this noise since EDO was still relevant and started when making the jump from SDR to DDR.
These were all their own standards on each platform. I skipped RAMBUS and DDR2.
Went from the worst DDR kit to the worst DDR3 kit. Needed experience with Crucial and G.Skill to get out.

Ryzen and DDR4 appealed to me during that golden era of Micron E-die and I don't regret choosing it.
Can you say the same about any DDR5 kits? Didn't think so. I think many of us are still waiting for the moment.
Say the same about DDR5 kits??

Of course I can. At release, DDR5 meant low frequencies, way-loose timings, and very high prices...
 
OF COURSE IT'S OBVIOUS that no one in 2025 will "close the browser and other stuff" just to play a game!:oops::rolleyes::kookoo:
So, 16 GB is "on the edge" for any average game, unless it's Solitaire.
Modern games don't need a lot of RAM and shouldn't require you to close browser windows unless you have a real tab-management problem. Most AAA games are designed to run on 16GB consoles, so they have a 3-6GB RAM footprint to free up 10GB+ of the shared 16GB for graphics duty. If you watch any of the thousands of YouTube videos showing the MSI Afterburner overlay, you'll typically see the system using well under 16GB of RAM even while screen recording is active.

16GB is enough for a modern AAA game, Windows, a few browser windows, music and voice/chat - you only need more than 16GB RAM if you're running a high-end GPU at max settings and have a bunch of other stuff that would make your 16GB system low on RAM before you even opened the game.

While it's not a huge difference, it's there, and that's on a card slower than a 3060 Ti, without even using things like ReBAR, where having 5-7 GB/s less bandwidth is bound to matter.
Heck, just looking at kits: 3600C18 runs at least at the same level as 3200C14, while usually being the same price or cheaper, and it's definitely more common to find in shops than 3200C14.
3200 C14 is on the expensive, premium side of 3200 RAM, the cheap 3200 kits similar in price to 3600 CL18 were all CL16 over here.

Absolute latency in ns is the most important factor for DDR4 on Zen3, but if two latencies are the same, then the RAM with the higher clock wins - not because of the bandwidth, but because the CPU's infinity fabric is running faster.

AMD always said 3600 was the performance sweet spot beyond which you were looking at very marginal gains for increasingly exotic and pricey RAM. For most of AM4's lifespan, 3200 was the best performance-per-dollar at ~95% or more of the performance of 3600 and typically a significantly (10-25%) lower cost.
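The "absolute latency in ns" comparison above works out to CL × 2000 ÷ MT/s, since one memory clock lasts 2000/MT/s nanoseconds (DDR: two transfers per clock). A small sketch with a helper name of my own choosing:

```python
def cas_latency_ns(cl: int, transfer_rate_mts: int) -> float:
    """First-word CAS latency in nanoseconds for a DDR kit.

    A memory clock cycle lasts 2000 / MT/s ns (two transfers per clock),
    so absolute latency = CL * 2000 / MT/s.
    """
    return cl * 2000 / transfer_rate_mts

# 3200 CL14 = 8.75 ns, 3600 CL16 ~ 8.89 ns, 3600 CL18 = 10.0 ns
for cl, rate in [(14, 3200), (16, 3600), (18, 3600)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(cl, rate):.2f} ns")
```

This shows why 3200 CL14 commands a premium: its absolute latency edges out even 3600 CL16, so any advantage 3600 kits have on Zen 3 comes from the faster fabric clock rather than the timings.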

The preferred offerings on AM5 boards still aren't financially sound or perpetually MIA/OoS.
I'm almost done with an upgrade at home from 5800X3D to 9800X3D. Grabbed an AM5 board from work and ordered some fast 6000 CL28 Flare X5 ready for when the 9800X3D comes down to its MSRP (or below). It's been at scalper prices for at least the last 3+ months and is finally appearing in stock at MSRP.

As for gaming performance, nothing apart from the 7800X3D or 9800X3D is enough of an improvement over a 5700X3D to merit spending all that extra money on a new motherboard and new RAM. The 9700X is a good all-rounder at $315 or so, but it's ~$550 if you also have to buy a new board and DDR5, compared to the ~$200 5700X3Ds have been selling for recently. Yes, the 9700X is a bit faster than a 5700X3D, but it's $200 vs $550 and they're often neck-and-neck in gaming (only) performance.

AM4 is AMD's strongest competition at this point. It's been around for so long, there are so many good deals on new boards, plenty of decent used bundles on eBay, and still good brand-new CPUs coming out of AMD to keep the platform relevant. I suspect the 5700X3D is the last product we'll see on AM4, but even as a dead-end platform, it's valid. It's not like the Intel Core Ultra 285K has any future either; LGA1851 is looking to be a single-generation platform that will never get any faster CPUs, so it's unfair to call AM4 a bad investment in 2025 when you look at the competition's paltry platform life.
 
@AusWolf
Never did.
But it's also not a game, it's a bench, and just a single one on older hardware as well. I just doubt that nowadays there's no impact from having 200 (or more) MHz less on everything else, all to avoid spending another $10 on 3600C18 instead of any 3200 kit (short of "B-die" kits that run at 3600).

Since the 3xxx series came out, any time I looked up parts for builds (US/EU) or helped others, 3600C18/20 wasn't much more, and I never had to take shortcuts elsewhere to stay within budget.
And even if it was sometimes ~$20 more (shortage/specific brand/model), it's not like that's going to buy you the next better GPU (new/upgrade).

And that's not even looking at the "big picture": counting all the gains like a CPU liquid cooler for better boost and lower case+GPU temps, a GPU with a factory OC/"better" cooling, and things like 3600 RAM, I always got about 5-10% better overall perf out of every build, compared to any review/test done with "stock/generic" stuff.
 
I tried to swap 3x8 GB 2400 for 2x16 GB 3200 with a cheapo GPU and an i7-8700. Pure BS idea, NO POINT. 24 was WAY BETTER.

So just how is 3x8 GB 2400 "way better" than 2x16 GB 3200? I'm truly baffled how you came to this conclusion.
 
So just how is 3x8 GB 2400 "way better" than 2x16 GB 3200? I'm truly baffled how you came to this conclusion.
It's way NOT better, haha. It would be running three sticks of RAM in single-channel mode. Total loss of performance. Check the QVL...
 
@AusWolf
Never did.
But it's also not a game, it's a bench, and just a single one on older hardware as well. I just doubt that nowadays there's no impact from having 200 (or more) MHz less on everything else, all to avoid spending another $10 on 3600C18 instead of any 3200 kit (short of "B-die" kits that run at 3600).

Since the 3xxx series came out, any time I looked up parts for builds (US/EU) or helped others, 3600C18/20 wasn't much more, and I never had to take shortcuts elsewhere to stay within budget.
And even if it was sometimes ~$20 more (shortage/specific brand/model), it's not like that's going to buy you the next better GPU (new/upgrade).

And that's not even looking at the "big picture": counting all the gains like a CPU liquid cooler for better boost and lower case+GPU temps, a GPU with a factory OC/"better" cooling, and things like 3600 RAM, I always got about 5-10% better overall perf out of every build, compared to any review/test done with "stock/generic" stuff.
I think those benchmark runs are representative of a game running in a GPU-limited scenario. Your RAM speed doesn't matter then. In that case, I can spend that $20 on a large pizza to enjoy with the missus while watching a film.
 
I didn't expect this to turn into 3 pages, but maybe I should have known better!

I went with AM4 because it's a well-known platform at this point. It should be very mature, and it's the best bang for the buck. Just going up even to the 7600X would have been a 25% cost increase, and I wouldn't get 25% more performance. Then consider I was coming from an Ivy Bridge E platform, so even a 5600X is a massive single core and multicore boost, all while using considerably less power. I wanted something that was low-maintenance, modern, and didn't consume a ton of power. I also gained things like ReBar, PCIe 4, and native NVME support. Being able to make a decent 1440p system that's powered by a 500W PSU was part of my aim, and it sure feels like mission accomplished. I think once I started going beyond what I got, I'd be spending a LOT more money for increasingly diminishing returns.

Could I have gone with AM5? Sure, but I suspect I will skip AM5 entirely based on my upgrade behavior. I wasn't looking to spend a ton, and for what I did spend, I got a substantial gain in performance for a relatively small amount of money.

As for RAM usage, I've been observing performance stats on Forbidden West at 1440p. It consumes 6-7GB of RAM, but will gladly use up to 11GB of the 6700XT's 12GB capacity. I'll give Hogwarts Legacy a try next, but I suspect it will be similar since my build exceeds the recommended specs.
 
I went with AM4 because it's a well-known platform at this point. It should be very mature, and it's the best bang for the buck. Just going up even to the 7600X would have been a 25% cost increase, and I wouldn't get 25% more performance.

Nice, have you been on Ivy-E all this time, or are we talking about one of many systems?

It really is hard to argue the value of 5600X and DDR4 when you're coming from such an old baseline - the performance difference between Zen3 and Zen4 isn't really worth talking about when you're concerned about the difference between Ivy Bridge and Zen3/Zen4.

I also think platform longevity differences between AM4 and AM5 are meaningless when we're talking about someone who is going to use a single configuration for 14 years - your platform is going to be unsupported and "dead-end" for the vast majority of its lifespan, regardless of how many years of platform support it still has at the time of purchase!
 
Nice, have you been on Ivy-E all this time, or are we talking about one of many systems?

It really is hard to argue the value of 5600X and DDR4 when you're coming from such an old baseline - the performance difference between Zen3 and Zen4 isn't really worth talking about when you're concerned about the difference between Ivy Bridge and Zen3/Zen4.

I also think platform longevity differences between AM4 and AM5 are meaningless when we're talking about someone who is going to use a single configuration for 14 years - your platform is going to be unsupported and "dead-end" for the vast majority of its lifespan, regardless of how many years of platform support it still has at the time of purchase!
I'd been using Ivy E for a few years when I picked gaming back up. So I shopped deals and found a board, 32 GB of RAM, an 8-core CPU, and a 5600XT for about $150. None of that is fast by today's standards, but it played most games just fine until I bought a few demanding titles.

I had a 2700X and 5700XT several years ago, but I sold it all when I took an extended break from gaming. I use macOS a lot, and for many years my DD was a dual-CPU 2010 Mac Pro with an RX 480, using OpenCore to run Monterey (I still have this machine). I'm typing this on a 2013 Mac Pro, using OpenCore to run Ventura. My primary demand outside of gaming is photo editing, which really isn't all that demanding unless you want to use AI noise reduction and sharpening, which I rarely do. I just prefer macOS for personal use, partly because the whole family has iPhones and iPads (makes sharing and managing easier), and partly because I use Windows all day at work and like the change of pace. It's not that I can't afford newer stuff; I just don't see the point in spending huge amounts of money for a slightly better experience.

I did start Hogwarts Legacy last night, and the 5600X and 7600XT handle it just fine. 60 FPS, and the game looks plenty attractive to my eyes using a mix of FSR and mid/high settings. I can get behind the idea of RT in principle, but certainly not at the current cost/benefit. Maybe in another 10 years?
 
It's way NOT better, haha. It would be running three sticks of RAM in single-channel mode. Total loss of performance. Check the QVL...
Oh yeah, and CPU-Z will show you "dual" just so you won't cry, right?

So just how is 3x8 GB 2400 "way better" than 2x16 GB 3200? I'm truly baffled how you came to this conclusion.
because VOLUME matters... sometimes...;):D

Modern games don't need a lot of RAM and shouldn't require you to close browser windows unless you have a real tab-management problem. Most AAA games are designed to run on 16GB consoles, so they have a 3-6GB RAM footprint to free up 10GB+ of the shared 16GB for graphics duty. If you watch any of the thousands of YouTube videos showing the MSI Afterburner overlay, you'll typically see the system using well under 16GB of RAM even while screen recording is active.

16GB is enough for a modern AAA game, Windows, a few browser windows, music and voice/chat - you only need more than 16GB RAM if you're running a high-end GPU at max settings and have a bunch of other stuff that would make your 16GB system low on RAM before you even opened the game.


3200 C14 is on the expensive, premium side of 3200 RAM, the cheap 3200 kits similar in price to 3600 CL18 were all CL16 over here.

Absolute latency in ns is the most important factor for DDR4 on Zen3, but if two latencies are the same, then the RAM with the higher clock wins - not because of the bandwidth, but because the CPU's infinity fabric is running faster.

AMD always said 3600 was the performance sweet spot beyond which you were looking at very marginal gains for increasingly exotic and pricey RAM. For most of AM4's lifespan, 3200 was the best performance-per-dollar at ~95% or more of the performance of 3600 and typically a significantly (10-25%) lower cost.


I'm almost done with an upgrade at home from 5800X3D to 9800X3D. Grabbed an AM5 board from work and ordered some fast 6000 CL28 Flare X5 ready for when the 9800X3D comes down to its MSRP (or below). It's been at scalper prices for at least the last 3+ months and is finally appearing in stock at MSRP.

As for gaming performance, nothing apart from the 7800X3D or 9800X3D is enough of an improvement over a 5700X3D to merit spending all that extra money on a new motherboard and new RAM. The 9700X is a good all-rounder at $315 or so, but it's ~$550 if you also have to buy a new board and DDR5, compared to the ~$200 5700X3Ds have been selling for recently. Yes, the 9700X is a bit faster than a 5700X3D, but it's $200 vs $550 and they're often neck-and-neck in gaming (only) performance.

AM4 is AMD's strongest competition at this point. It's been around for so long, there are so many good deals on new boards, plenty of decent used bundles on eBay, and still good brand-new CPUs coming out of AMD to keep the platform relevant. I suspect the 5700X3D is the last product we'll see on AM4, but even as a dead-end platform, it's valid. It's not like the Intel Core Ultra 285K has any future either; LGA1851 is looking to be a single-generation platform that will never get any faster CPUs, so it's unfair to call AM4 a bad investment in 2025 when you look at the competition's paltry platform life.
Oh yeah, another "console theory". On a console, your average Linux BS runs in 64 MB of RAM; Win 10 needs 4 GB to run ITSELF properly, Win 11 way more! Running a game with 90+% of RAM used IS NOT "OK".:rolleyes::oops:
 
I didn't expect this to turn into 3 pages, but maybe I should have known better!
I went with AM4 because it's a well-known platform at this point. It should be very mature, and it's the best bang for the buck. Just going up even to the 7600X would have been a 25% cost increase, and I wouldn't get 25% more performance. Then consider I was coming from an Ivy Bridge E platform, so even a 5600X is a massive single core and multicore boost, all while using considerably less power. I wanted something that was low-maintenance, modern, and didn't consume a ton of power. I also gained things like ReBar, PCIe 4, and native NVME support. Being able to make a decent 1440p system that's powered by a 500W PSU was part of my aim, and it sure feels like mission accomplished. I think once I started going beyond what I got, I'd be spending a LOT more money for increasingly diminishing returns.
As for RAM usage, I've been observing performance stats on Forbidden West at 1440p. It consumes 6-7GB of RAM, but will gladly use up to 11GB of the 6700XT's 12GB capacity. I'll give Hogwarts Legacy a try next, but I suspect it will be similar since my build exceeds the recommended specs.
 
because VOLUME matters... sometimes...;):D
In my book, 32 GB of faster dual-channel RAM is "more volume" than 24 GB of single-channel RAM. You said it was, and I quote, WAY BETTER; it is not better in any way, unless you count less of it, and slower, as better.
 
Oh yeah, and CPU-Z will show you "dual" just so you won't cry, right?


because VOLUME matters... sometimes...;):D


Oh yeah, another "console theory". On a console, your average Linux BS runs in 64 MB of RAM; Win 10 needs 4 GB to run ITSELF properly, Win 11 way more! Running a game with 90+% of RAM used IS NOT "OK".:rolleyes::oops:
It's not a theory, it's easily verifiable fact:
  • Xbox runs a stripped-down version of Windows with a 1.5GB footprint, not a "64MB Linux-BS" desktop.
  • Windows 10/11 runs in a dynamic amount of RAM that is typically around 4-6GB on a system with 16GB of RAM.
    • On a 128GB machine, W10 Enterprise is currently using 18GB.
    • On a new 32GB W11 ThinkPad I unpacked this morning, the Lenovo image of Windows 11 Pro that I'm about to wipe is using 5.8GB sitting idle at the desktop.
So, let's say Windows uses 6GB and the game uses 6GB; there's still enough RAM left to run plenty of browser/background jobs. It's not rocket science, and Windows is actually half-decent at compressing background crap without really impacting any foreground applications or services.

There are hundreds of streamers who have uploaded thousands of hours of YouTube videos with MSI afterburner overlays showing modern games running just fine and the total system memory footprint as little as 10GB, though admittedly it's normally 11-15GB. That's Windows using all the RAM it wants, the game using all the RAM it wants, and there still being plenty of headroom free for other stuff.

I'm not saying that 32GB isn't nice to have, but adding more RAM costs money and provides ZERO performance unless you are actually short of RAM.
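The headroom math above is simple enough to write down as a toy budget (the numbers are the post's examples; the helper is purely illustrative):

```python
def free_headroom_gb(total_gb: int, os_gb: int, game_gb: int, other_gb: int = 0) -> int:
    """Remaining RAM after the OS, game, and background apps (toy model).

    Ignores memory compression and dynamic OS behavior; it's just the
    subtraction the argument above is making.
    """
    return total_gb - (os_gb + game_gb + other_gb)

# 16GB system, Windows ~6GB, AAA game ~6GB -> 4GB left for browser/voice/music
print(free_headroom_gb(16, 6, 6))
```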
 
It's not a theory, it's easily verifiable fact:
  • Xbox runs a stripped-down version of Windows with a 1.5GB footprint, not a "64MB Linux-BS" desktop.
  • Windows 10/11 runs in a dynamic amount of RAM that is typically around 4-6GB on a system with 16GB of RAM.
    • On a 128GB machine, W10 Enterprise is currently using 18GB.
    • On a new 32GB W11 ThinkPad I unpacked this morning, the Lenovo image of Windows 11 Pro that I'm about to wipe is using 5.8GB sitting idle at the desktop.
So, let's say Windows uses 6GB and the game uses 6GB; there's still enough RAM left to run plenty of browser/background jobs. It's not rocket science, and Windows is actually half-decent at compressing background crap without really impacting any foreground applications or services.

There are hundreds of streamers who have uploaded thousands of hours of YouTube videos with MSI afterburner overlays showing modern games running just fine and the total system memory footprint as little as 10GB, though admittedly it's normally 11-15GB. That's Windows using all the RAM it wants, the game using all the RAM it wants, and there still being plenty of headroom free for other stuff.

I'm not saying that 32GB isn't nice to have, but adding more RAM costs money and provides ZERO performance unless you are actually short of RAM.
The operating system reserves more RAM if you have more available, so OS RAM usage can't be tracked accurately. I have an HTPC with only 8 GB, and an up-to-date Windows 10 uses about 4 on it, and it's as smooth as any other system.
 
The operating system reserves more RAM if you have more available, so OS RAM usage can't be tracked accurately. I have an HTPC with only 8 GB, and an up-to-date Windows 10 uses about 4 on it, and it's as smooth as any other system.
AFAIK 4GB is the recommended minimum spec for Windows 10, not the actual minimum spec. Times have changed since 2021, but check my system specs; that thing used to handle plenty of browser tabs in Chrome whilst also running a few background processes like qBittorrent, and sometimes a WinXP VM as well.

It was a 32-bit system and IIRC there were variants that shipped with 1GB of RAM which was the minimum spec for Windows 8.1 at the time.
 
I can boot Nanoserver in a VM on as little as 256MB and it FEELS rough.
Win10PE boots fine on 2GB, which is fine on a 16 year old CPU.
In my early Win8.1+VR days 4GB was painful. 8GB was tolerable.
Win10 is okay on 4ish if NOTHING is running and proves the Surface 3 is eWaste.
8GB should have been THE bare minimum option for any kind of workstation since 2015.
On my workstation, ~20GB is in use at any given moment.
It usually has Steam, Notepad, Remote Desktop, PowerShell, and like 30 Brave tabs open (I've calmed down a bit).
The eMachines still proves 2GB is the minimum for storage server, so there's that too.

Still feels like nobody needs over 32GB but the moment I open Unity, Blender and SteamVR it EATS.
 
It's not a theory, it's easily verifiable fact:
  • Xbox runs a stripped-down version of Windows with a 1.5GB footprint, not a "64MB Linux-BS" desktop.
  • Windows 10/11 runs in a dynamic amount of RAM that is typically around 4-6GB on a system with 16GB of RAM.
    • On a 128GB machine, W10 Enterprise is currently using 18GB.
    • On a new 32GB W11 ThinkPad I unpacked this morning, the Lenovo image of Windows 11 Pro that I'm about to wipe is using 5.8GB sitting idle at the desktop.
So, let's say Windows uses 6GB and the game uses 6GB; there's still enough RAM left to run plenty of browser/background jobs. It's not rocket science, and Windows is actually half-decent at compressing background crap without really impacting any foreground applications or services.

There are hundreds of streamers who have uploaded thousands of hours of YouTube videos with MSI afterburner overlays showing modern games running just fine and the total system memory footprint as little as 10GB, though admittedly it's normally 11-15GB. That's Windows using all the RAM it wants, the game using all the RAM it wants, and there still being plenty of headroom free for other stuff.

I'm not saying that 32GB isn't nice to have, but adding more RAM costs money and provides ZERO performance unless you are actually short of RAM.
TY.:)
As for the "new ThinkPad": reinstall Windows from a fresh original ISO and be happy with more free RAM and fewer CPU-f***ing useLESS processes from all that "support" BS. The same rule number 1 goes for ALL NEW laptops with Windows preinstalled.;)
 
TY.:)
As for the "new ThinkPad": reinstall Windows from a fresh original ISO and be happy with more free RAM and fewer CPU-f***ing useLESS processes from all that "support" BS. The same rule number 1 goes for ALL NEW laptops with Windows preinstalled.;)
As mentioned, it was about to be wiped. We have our own image because nobody who works in IT trusts an OS they didn't configure from scratch themselves. ThinkPads, being pro-tier hardware, aren't bad at all when it comes to vendor bloat, but I have no way to know for sure what Lenovo has done to the preinstalled OS, and I have an image that overwrites the disk with our deployment of Win11, all the local policies, a bunch of legal crap that out-of-the-box installs don't have, and around 200GB of required software preinstalled.

That would take me a day per laptop to install manually; I'm in the habit of prepping multiple laptops for the company in a morning via a mostly automated, unattended process ;)
 
As mentioned, it was about to be wiped. We have our own image because nobody who works in IT trusts an OS they didn't configure from scratch themselves. ThinkPads, being pro-tier hardware, aren't bad at all when it comes to vendor bloat, but I have no way to know for sure what Lenovo has done to the preinstalled OS, and I have an image that overwrites the disk with our deployment of Win11, all the local policies, a bunch of legal crap that out-of-the-box installs don't have, and around 200GB of required software preinstalled.

That would take me a day per laptop to install manually; I'm in the habit of prepping multiple laptops for the company in a morning via a mostly automated, unattended process ;)
I see someone else here is serious about deployment. I only recently found those tools and my god do they save me some time.
 