
System unresponsive at max RAM usage

  • Thread starter: Deleted member 50521

Deleted member 50521

Guest
Running some of my work on my PC. RAM usage is close to the limit, and now the system becomes unresponsive. I have the data stored on an HDD, and I set up a 256GB page file on my NVMe SSD. Somehow the pagefile drive is not seeing any activity. The system hasn't BSOD'd, and I can hear the disk operating, but no mouse movement or anything else registers on screen. This is a fairly important piece of work that has been running for 3 days. Is such unresponsive behavior normal once RAM fills up?


C drive is where I put my page file. You can see there is 0 activity.

[Attachment: IMG_1626.JPG]
 
Yeah, maximum usage will make even simple input completely unresponsive.
 
Max CPU and RAM use will cause it.
 
At least you have an SSD as well; I was maxing out my RAM whilst using a 2.5" HDD, and it was simply painful.
 
Ended up with a segmentation fault after all. Seems I was out of RAM. :(
 
You have the pagefile on an SSD? If you have multiple SSDs and lots of free space, you could try enabling a pagefile on all of them. Maybe that'll help you; when you're constantly running out of RAM, a bigger or additional PF is probably your best option.
 
He already said it doesn't seem to be using it... :)

I take it the E drive is not where your PF is? That one is working at 100%...
 
My pagefile size was set to "system managed", so I was not expecting it to actually run out of RAM. The segmentation fault happened at the final stage after several days of extreme RAM usage. I suspect either my CPU overclock is unstable, or the RAM is unstable at its current speed/timings for a prolonged workload.


He already said it doesn't seem to be using it... :)

I take it the E drive is not where your PF is? That one is working at 100%...

No, during the seg fault my C drive had 100% activity. According to the log file, it got past that previous command successfully.
 
He already said it doesn't seem to be using it... :)

I take it the E drive is not where your PF is? That one is working at 100%...
He can check the usage through Process Hacker; it can tell you the exact PF size (refreshed periodically) and the programs using the PF on the last tab.

Also, set the PF to a fixed size; the more the merrier.
[Screenshots: Process-Hacker-Portable_2.png, Process-Hacker-Portable_7.png]
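If you want to try a fixed size without clicking through the System Properties dialog, the classic wmic commands can do it from an elevated Command Prompt. This is just a sketch; sizes are in MB, and you should double-check the drive letter and values for your own machine before running it:

```shell
:: Turn off "Automatically manage paging file size for all drives"
wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False

:: Pin C:\pagefile.sys to a fixed 256 GB (262144 MB); setting initial = maximum
:: avoids on-the-fly resizing
wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=262144,MaximumSize=262144
```

A reboot is usually needed before the new pagefile settings take effect.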
 
I thought "system managed" is the preferred setting for the pagefile.


I could try lowering the RAM to 2133; after all, BWE is not that sensitive to RAM speed.


Or I could get some load-reduced/registered DDR4 512GB RAM for dirt cheap.
 
Depends on use. That said, since SSDs came along I've always set it manually to 2GB (I've had 8GB+ of RAM since that time). Clearly your usage dictates a larger size, but, just saying...
 
This is a fairly important piece of work that has been running for 3 days. Is such unresponsive behavior normal once RAM fills up?


C drive is where I put my page file. You can see there is 0 activity.

View attachment 93871
What is the name of the application you are using that gave the segmentation fault?
 
Mothur, a 16S ribosomal RNA assembly and clustering program, running in Linux mode under the Windows 10 FCU Ubuntu 16.04 subsystem.
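Since the job runs inside the Ubuntu subsystem, it's worth checking what the Linux side thinks the memory and swap situation is by reading /proc/meminfo (note that under the WSL of that era, memory requests go through the Windows memory manager, so the swap figures may not mirror the Windows pagefile). A quick standard-library sketch:

```python
# Parse memory/swap figures from /proc/meminfo (Linux/WSL).
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            # Values are reported in kB, e.g. "MemTotal:  16303732 kB"
            info[key.strip()] = int(rest.split()[0])
    return info

m = meminfo()
print("MemTotal:  %d MB" % (m["MemTotal"] // 1024))
print("MemFree:   %d MB" % (m["MemFree"] // 1024))
print("SwapTotal: %d MB" % (m["SwapTotal"] // 1024))
print("SwapFree:  %d MB" % (m["SwapFree"] // 1024))
```

If SwapTotal shows 0 inside the subsystem, the Linux-side process has no swap of its own to fall back on, which would be consistent with the pagefile drive showing no activity.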
Will that setup complete a smaller, less intensive job without an error? Like, say, a 5-hour job?
 
Will that setup complete a smaller, less intensive job without an error? Like, say, a 5-hour job?

Yep. With a dummy data set that is 1/34 the size, it completes in 10 hours without error.


Funny thing is, this current data set is considered a tiny one compared to the rest we have coming out of the sequencing platforms.
 
Yep. With a dummy data set that is 1/34 the size, it completes in 10 hours without error.


Funny thing is, this current data set is considered a tiny one compared to the rest we have coming out of the sequencing platforms.
Okay, is this the first time you've had an error in a job that runs as long as the one that produced the error? Have you ever completed a job like this without errors, and if so, what has changed in the setup since then? For example, FCU?
 
I am troubleshooting this pipeline. I'm fairly new to mothur, and all the successful runs were on test data sets.

So basically, any small data set (<600GB, it seems) completes with no problem.

Large datasets are what I'm troubleshooting now.
 
I am troubleshooting this pipeline. I'm fairly new to mothur, and all the successful runs were on test data sets.

So basically, any small data set (<600GB, it seems) completes with no problem.

Large datasets are what I'm troubleshooting now.
Well, maybe you are onto something with your quote below.
I thought "system managed" is the preferred setting for the pagefile.


I could try lowering the RAM to 2133; after all, BWE is not that sensitive to RAM speed.


Or I could get some load-reduced/registered DDR4 512GB RAM for dirt cheap.

I would actually get hold of the vendor and see what they have to say, seeing as how the jobs run so long. https://www.mothur.org/forum/memberlist.php?mode=contactadmin&sid=7fa22c67e332f791ca56d609fbf02f78

EDIT: I was reading in other forums that the first line of advice for this error is to make sure you are using the most recent version of Mothur.

EDIT 2: The most recent version is 1.39.5, located here: https://github.com/mothur/mothur/releases/tag/v1.39.5
 
Mothur, a 16S ribosomal RNA assembly and clustering program, running in Linux mode under the Windows 10 FCU Ubuntu 16.04 subsystem.
Have you had a successful computation using this before? I was under the impression this was still a fairly beta feature.
 
Back in the day, the rule of thumb was: whatever amount of memory you have, double it for the swap/paging file.

Windows 7 requires 2GB itself, at least in my case it does. If there is a way to set your program to use a fixed amount of RAM, I would leave 1GB to spare; so if you have 16GB of RAM, set the program to use 13GB at most.
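If the heavy step runs on the Linux side, one way to enforce that kind of cap is to limit the process's address space, so an oversized allocation fails with a catchable error instead of dragging the whole machine into swap. A minimal sketch using Python's standard resource module (the sizes here are just illustrative, not the 13GB from the example above):

```python
import resource

# Cap this process's address space at ~2 GB (illustrative; pick a value
# that leaves headroom for the OS, per the rule of thumb above).
limit = 2 * 1024**3
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

try:
    buf = bytearray(4 * 1024**3)  # try to grab 4 GB -> exceeds the cap
except MemoryError:
    print("allocation refused -- process fails cleanly instead of thrashing")
```

The same idea applies to a shell-launched job via `ulimit -v` before starting it, though the process then has to handle the failed allocation gracefully.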
 