There are circumstances where severely fragmented files on an SSD can degrade performance slightly. Fragmentation can also cause uneven sector wear in some instances.
It is senseless and unrealistic to use extreme, rarely seen scenarios to set policy, or to justify a rule for all, or the vast majority of, situations.
A drunk driver could, in certain circumstances, jump the curb, swerve past two trees and land on my porch too. I guess I better not stand on my porch.
An SSD is like a mail sorting box with a robot arm stuffing and retrieving data chunks into and out of each slot. Except in extreme circumstances that the vast majority of users will never see, it takes no more time if the file segments are distributed in slots 2, 14, 31, 7, 23, and 16 than it does if they are stuffed in slots 1, 2, 3, 4, 5, and 6. For that reason, fragmentation is not a problem with SSDs, and SSDs do not need to be defragged. In fact, any defrag program worth its salt will not attempt to defrag an SSD. And uneven wear on SSDs is prevented by TRIM and wear leveling, so suggesting uneven sector wear is just misinformation.
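The mail sorting box analogy can be sketched in a few lines of Python (a toy model, not a real storage API): indexed access costs the same no matter which slots hold the file's chunks.

```python
# Toy model of the mail sorting box: a store whose slots are directly
# addressable, so read cost does not depend on where the chunks sit.
slots = {i: f"chunk-{i}" for i in range(1, 33)}

def read_file(order):
    """Fetch a file's chunks given the slots they were stuffed into."""
    # One O(1) lookup per chunk, whether the slots are scattered or
    # consecutive - there is no drive head to move, unlike a spinning disk.
    return [slots[i] for i in order]

fragmented = read_file([2, 14, 31, 7, 23, 16])
contiguous = read_file([1, 2, 3, 4, 5, 6])
assert len(fragmented) == len(contiguous) == 6
```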
That statement assumes that Microsoft's built-in defrag service does its job as intended. In practice, that service is interrupted frequently and ends up making things worse over the long run.
Oh bullfeathers! That's one of the silliest things I've heard in a long time. Please show us a current study, using today's typical monster drives, that says Microsoft's built-in defragger in Windows 10 makes things worse!
For the record, nothing says the process cannot be interrupted. In fact, it is designed that way - to operate in the background when the computer is idle, and to step out of the way when the user starts using the computer again. And because it runs on a regular schedule, fragmentation stays minimal between passes. It is ridiculous to suggest an entire drive must be defragged without interruption.
It is not like files are left open or fragments are suddenly lost.
Do you want to kill the power or kill the process in Task Manager? Absolutely not, as that could lead to file corruption. But killing the power or terminating the task in Task Manager is not the same thing as the program gracefully halting and stepping aside.
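A rough sketch of what "gracefully halting" means in code - a hypothetical worker, not Microsoft's actual service: the stop request is only honored between fragments, so nothing is ever left half-moved.

```python
import threading

class BackgroundTask:
    """Hypothetical defrag-style worker that stops only at safe points."""

    def __init__(self, chunks):
        self.pending = list(chunks)
        self.done = []
        self._stop = threading.Event()

    def request_stop(self):
        # Cooperative interruption: the flag is checked between chunks,
        # so a stop request never tears a chunk in half.
        self._stop.set()

    def run(self):
        while self.pending and not self._stop.is_set():
            chunk = self.pending.pop(0)
            self.done.append(chunk * 2)  # stand-in for "move one fragment"
        return self.done

task = BackgroundTask([1, 2, 3])
task.run()
```

Killing the process instead (Task Manager, power loss) bypasses `request_stop` entirely, which is exactly the difference being drawn above.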
Using the method described above, it is very much a stable, well-performing "set it and forget it" situation.
No it isn't! Clearly you are not the master of virtual memory you think you are. And it is clear you don't know who Mark Russinovich is either - understand that Microsoft hired him long after he proved himself as one of the world's top experts. He is not a Microsoft shill.
If you set it with 4GB of RAM installed and then forget it, then install another 4GB of RAM, your PF settings will likely be wrong. If you start using that computer for totally different tasks, your PF settings will likely be wrong. If another user starts using that computer, your PF settings will likely be wrong.
If it were truly set-and-forget, why doesn't Microsoft just pick 1.5 x RAM or 2 x RAM and leave it? Why go through the complex ordeal of making that a dynamic process? Sure, if you have 8GB of RAM and you set your PF to 12GB, you will have enough virtual memory. But if you don't need that much, you have just wasted a bunch of disk space - especially with your totally inefficient suggestion to set not just the maximum, but the minimum, at that level too.
FTR, I have 16GB of RAM installed in this system. Microsoft has my PF currently set to 2432MB. I don't need or want an extra 10GB of my SSD obligated to a fixed PF size. That would be inefficient and a waste of space. And, for that matter, using up such a large chunk of space could contribute to fragmentation - if this were a hard drive.
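The arithmetic behind that complaint, using the numbers from this post (a 2432MB dynamic page file on a 16GB machine, versus a suggested fixed 12GB):

```python
# Numbers from the post above; the 12GB figure is the fixed min = max
# setting being argued against, not a recommendation.
dynamic_pf_mb = 2432            # what Windows chose on this 16GB machine
suggested_fixed_mb = 12 * 1024  # a fixed 12GB page file

wasted_mb = suggested_fixed_mb - dynamic_pf_mb
wasted_gb = wasted_mb / 1024
# roughly the "extra 10GB" of SSD space tied up for nothing
print(f"wasted: ~{wasted_gb:.1f} GB")
```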
Microsoft is not the end-all-be-all of computing. If they were, there would be no need to make laws to control their many instances of unlawful, unethical behavior.
Ah! There it is! Your true biased colors just came out.
I made a specific point to differentiate the developers and the work they do from the marketing and executive people who often make dumb decisions. But you lump them all together as if they come from the same "unethical" mindset - as if setting page file sizes and defragging parameters has something to do with business ethics.
Oh well. I'm outta here.