How to Programmatically Limit My Program's CPU Usage to Below 70%

How can I programmatically limit my program's CPU usage to below 70%?

That's really not your program's concern: distributing processor time between running processes is the job of the operating system. If you'd like to give other processes first crack at getting their work done, simply reduce the priority of your own process by modifying its Process.PriorityClass value.

See also: Windows Equivalent of ‘nice’
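As a rough sketch of the same idea from Python: on Unix the standard library's os.nice can raise the process's nice value, and on Windows the equivalent needs the third-party psutil package (an assumption here: that psutil is installed). The function name deprioritize is mine, not from the answer.

```python
import os
import sys

def deprioritize():
    """Lower this process's priority so other processes run first.

    A rough Python analogue of setting Process.PriorityClass in .NET.
    The Windows branch assumes the third-party psutil package.
    """
    if sys.platform == "win32":
        import psutil
        psutil.Process().nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)
        return psutil.Process().nice()
    # On Unix, a higher nice value means a lower scheduling priority.
    return os.nice(10)
```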

Limit total CPU usage in python multiprocessing

The solution depends on what you want to do. Here are a few options:

Lower priorities of processes

You can nice the subprocesses. This way, although they will still eat 100% of the CPU, the OS gives preference to other applications when you start them. If you want to leave a work-intensive computation running in the background on your laptop and don't care about the CPU fan running all the time, then setting the nice value with psutil is your solution. The following test script runs on all cores for long enough that you can see how it behaves.

from multiprocessing import Pool, cpu_count
import math
import psutil
import os

def f(i):
    return math.sqrt(i)

def limit_cpu():
    """Called at every process start."""
    p = psutil.Process(os.getpid())
    # Set to a lower priority; this is Windows-only. On Unix use p.nice(19).
    p.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)

if __name__ == '__main__':
    # start "number of cores" processes
    pool = Pool(None, limit_cpu)
    for p in pool.imap(f, range(10**8)):
        pass

The trick is that limit_cpu is run at the beginning of every process (see the initializer argument in the docs). Whereas Unix has nice levels from -20 (highest priority) to 19 (lowest priority), Windows has a few distinct priority classes. BELOW_NORMAL_PRIORITY_CLASS probably fits your requirements best; there is also IDLE_PRIORITY_CLASS, which tells Windows to run your process only when the system is idle.

You can view the priority if you switch to the Details tab in Task Manager and right-click on the process:


Lower number of processes

Although you have rejected this option, it still might be a good one: Say you limit the number of subprocesses to half the CPU cores using pool = Pool(max(cpu_count()//2, 1)); then the OS initially runs those processes on half the CPU cores, while the others stay idle or run whatever other applications are currently active. After a short time, the OS reschedules the processes and may move them to other CPU cores, and so on. Both Windows and Unix-based systems behave this way.

Windows: Running 2 processes on 4 cores:

OSX: Running 4 processes on 8 cores:


You can see that both OSes balance the processes between the cores, although not evenly, so you still see a few cores with higher percentages than others.

Sleep

If you absolutely want to make sure that your processes never eat 100% of a certain core (e.g. if you want to keep the CPU fan from spinning up), then you can run sleep in your processing function:

import math
from time import sleep

def f(i):
    sleep(0.01)
    return math.sqrt(i)

This makes the OS "schedule out" your process for 0.01 seconds on each computation and makes room for other applications. If there are no other applications, the CPU core sits idle, so it will never reach 100%. You'll need to play around with different sleep durations; the right value will also vary from computer to computer. If you want to make it very sophisticated, you could adapt the sleep depending on what cpu_times() reports.
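If you do go down the adaptive route, here is a sketch of the idea using only the standard library (time.process_time in place of psutil's cpu_times(); the 0.70 target and the function names are my own illustrative choices). The loop compares CPU time consumed against wall-clock time and sleeps off any excess above the target duty cycle:

```python
import time

def run_throttled(work, iterations=50, target=0.70):
    """Call work() repeatedly while keeping this process's CPU duty
    cycle near `target` (e.g. 0.70 = below ~70% of one core)."""
    cpu0 = time.process_time()
    wall0 = time.monotonic()
    for _ in range(iterations):
        work()
        cpu = time.process_time() - cpu0
        wall = time.monotonic() - wall0
        # If we've used more CPU time than `target` allows for the
        # elapsed wall time, sleep long enough to fall back under it.
        excess = cpu / target - wall
        if excess > 0:
            time.sleep(excess)
```

Because the comparison is cumulative, the sleep lengths adapt automatically to however expensive each work() call turns out to be.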

How do I monitor the computer's CPU, memory, and disk usage in Java?

Along the lines of what I mentioned in this post, I recommend you use the SIGAR API. I use the SIGAR API in one of my own applications and it is great. You'll find it is stable, well supported, and full of useful examples. It is open source under an Apache 2.0 license. Check it out; I have a feeling it will meet your needs.

Using Java and the Sigar API you can get Memory, CPU, Disk, Load-Average, Network Interface info and metrics, Process Table information, Route info, etc.
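For a sense of what a few of those categories cover, here is a rough Python-standard-library analogue (load average, disk usage, CPU count). This is just an illustration for comparison, not SIGAR itself, and os.getloadavg is Unix-only:

```python
import os
import shutil

# Load average over the last 1, 5 and 15 minutes (Unix-only).
load1, load5, load15 = os.getloadavg()

# Total/used/free bytes on the filesystem holding the root directory.
disk = shutil.disk_usage("/")

# Number of logical CPUs visible to the OS.
cpus = os.cpu_count()
```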

Tomcat consuming high CPU

To understand what's happening, you should run it under a profiler. Try YourKit (http://www.yourkit.com/) or the NetBeans profiler (http://profiler.netbeans.org/docs/help/5.5/profile_j2ee_profileproject.html).

YourKit has the better Tomcat integration.

How prevent CPU usage 100% because of worker process in iis

Well, this can take a long time to figure out. A few points to narrow it down:

  • Identify what is killing the CPU. I recommend Process Explorer http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx
  • Identify what AppPool is causing this
  • Fix your code

Tracking *maximum* memory usage by a Python function

This question seemed rather interesting, and it gave me a reason to look into Guppy/Heapy; for that I thank you.

I tried for about 2 hours to get Heapy to monitor a function call/process without modifying its source, with zero luck.

I did find a way to accomplish your task using the built-in Python library resource. Note that the documentation does not indicate what unit the ru_maxrss value is returned in. Another SO user noted that it was in kB. Running Mac OS X 10.7.3 and watching my system resources climb during the test code below, I believe the returned values are in bytes, not kilobytes.

A 10,000 ft view of how I used the resource library to monitor the library call: launch the function in a separate (monitorable) thread, and track the system resources for that process from the main thread. Below are the two files you'd need to run to test it out.
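Before the two files, a minimal sketch of the unit caveat above: ru_maxrss is reported in kilobytes on Linux but in bytes on macOS, so a small helper can normalize it (the helper name and the platform check are my own additions, not part of the original answer):

```python
import resource
import sys

def peak_rss_mb():
    """Peak resident set size of this process, in megabytes.

    ru_maxrss is in kilobytes on Linux and in bytes on macOS,
    so normalize to bytes before converting.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    bytes_per_unit = 1 if sys.platform == "darwin" else 1024
    return rss * bytes_per_unit / 1e6
```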

Library Resource Monitor - whatever_you_want.py

import resource
import time

from stoppable_thread import StoppableThread

class MyLibrarySniffingClass(StoppableThread):
    def __init__(self, target_lib_call, arg1, arg2):
        super(MyLibrarySniffingClass, self).__init__()
        self.target_function = target_lib_call
        self.arg1 = arg1
        self.arg2 = arg2
        self.results = None

    def startup(self):
        # Overload the startup function
        print("Calling the Target Library Function...")

    def cleanup(self):
        # Overload the cleanup function
        print("Library Call Complete")

    def mainloop(self):
        # Start the library call
        self.results = self.target_function(self.arg1, self.arg2)

        # Kill the thread when complete
        self.stop()

def SomeLongRunningLibraryCall(arg1, arg2):
    max_dict_entries = 2500
    delay_per_entry = .005

    some_large_dictionary = {}
    dict_entry_count = 0

    while True:
        time.sleep(delay_per_entry)
        dict_entry_count += 1
        some_large_dictionary[dict_entry_count] = range(10000)

        if len(some_large_dictionary) > max_dict_entries:
            break

    print(arg1 + " " + arg2)
    return "Good Bye World"

if __name__ == "__main__":
    # Lib Testing Code
    mythread = MyLibrarySniffingClass(SomeLongRunningLibraryCall, "Hello", "World")
    mythread.start()

    start_mem = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    delta_mem = 0
    max_memory = 0
    memory_usage_refresh = .005  # seconds

    while True:
        time.sleep(memory_usage_refresh)
        delta_mem = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss - start_mem
        if delta_mem > max_memory:
            max_memory = delta_mem

        # Uncomment this line to see the memory usage during run-time
        # print("Memory Usage During Call: %d MB" % (delta_mem / 1000000.0))

        # Check to see if the library call is complete
        if mythread.isShutdown():
            print(mythread.results)
            break

    print("\nMAX Memory Usage in MB: " + str(round(max_memory / 1000000.0, 3)))

Stoppable Thread - stoppable_thread.py

import threading

class StoppableThread(threading.Thread):
    def __init__(self):
        super(StoppableThread, self).__init__()
        self.daemon = True
        self.__monitor = threading.Event()
        self.__monitor.set()
        self.__has_shutdown = False

    def run(self):
        '''Overloads threading.Thread.run'''
        # Call the user's startup function
        self.startup()

        # Loop until the thread is stopped
        while self.isRunning():
            self.mainloop()

        # Clean up
        self.cleanup()

        # Flag to the outside world that the thread has exited
        # AND that the cleanup is complete
        self.__has_shutdown = True

    def stop(self):
        self.__monitor.clear()

    def isRunning(self):
        return self.__monitor.is_set()

    def isShutdown(self):
        return self.__has_shutdown

    ###############################
    ### User Defined Functions ####
    ###############################

    def mainloop(self):
        '''
        Expected to be overridden in a subclass!!
        Note that the stoppable while-loop is handled in the built-in run.
        '''
        pass

    def startup(self):
        '''Expected to be overridden in a subclass!!'''
        pass

    def cleanup(self):
        '''Expected to be overridden in a subclass!!'''
        pass

