How to Release Memory After Creating Matplotlib Figures

How can I release memory after creating matplotlib figures

Did you try running your task function several times in a loop (outside of Celery) to confirm that the function itself isn't leaking, regardless of Celery?
Also make sure that django.conf.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).

Matplotlib doesn't release memory after savefig and close()

Taken from here: Matplotlib errors result in a memory leak. How can I free up that memory?

The original reference is: https://www.mail-archive.com/matplotlib-users@lists.sourceforge.net/msg11809.html

To get the figure and axes, instead of:

import matplotlib.pyplot as plt
fig, ax = plt.subplots(1)

use:

from matplotlib import figure
fig = figure.Figure()
ax = fig.subplots(1)

There is also no need to call plt.close() or anything else. It worked for me.
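As a sketch of why this helps: figures built directly from matplotlib.figure.Figure never enter pyplot's global figure manager, so nothing retains them between iterations. The loop below and the output directory are illustrative; since matplotlib 3.1, fig.savefig works on a Figure created this way without an explicit canvas.

```python
import os
import tempfile

from matplotlib.figure import Figure

out_dir = tempfile.mkdtemp()
for i in range(3):
    # Bypass pyplot entirely: no global state holds a reference to fig
    fig = Figure()
    ax = fig.subplots()
    ax.plot([0, 1, 2], [i, i + 1, i + 2])
    fig.savefig(os.path.join(out_dir, f"plot_{i}.png"))
    # fig goes out of scope here; ordinary garbage collection reclaims it
```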

How to clear Python/matplotlib memory?

I've been battling this for weeks and the only thing that worked for me was the solution presented here:

How to clear memory completely of all Matplotlib plots

matplotlib.pyplot.figure().clear()
matplotlib.pyplot.close()

The following:

plt.cla()

and

plt.clf() 

didn't work for me at all. I suspect that's because they are designed for the case where you have more than one subplot.
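For reference, the three calls do different amounts of cleanup; a minimal sketch (using the non-interactive Agg backend so it runs off-screen):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, safe for off-screen use
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([1, 2, 3])

ax.cla()        # clears only this Axes; the figure still exists
fig.clf()       # clears the figure's contents; pyplot still tracks the figure
plt.close(fig)  # releases the figure from pyplot's figure manager
```

Only the last step removes pyplot's own reference to the figure, which is why cla()/clf() alone may not free the memory.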

Matplotlib runs out of memory when plotting in a loop

Is each loop supposed to generate a new figure? I don't see you closing it or creating a new figure instance from loop to loop.

This call will clear the current figure after you save it at the end of the loop:

pyplot.clf()

I'd refactor, though, to make your code more object-oriented and create a new figure instance on each loop iteration:

from matplotlib import pyplot

while True:
    fig = pyplot.figure()
    ax = fig.add_subplot(111)
    ax.plot(x, y)
    ax.legend(legendStrings, loc='best')
    fig.savefig('himom.png')
    # etc....

Matplotlib errors result in a memory leak. How can I free up that memory?

I assume you can run the code you posted at least once. The problem only manifests itself after running the posted code many times. Correct?

If so, the following avoids the problem without really identifying the source of the problem.
Maybe that is a bad thing, but this works in a pinch: Simply use multiprocessing to run the memory-intensive code in a separate process. You don't have to worry about fig.clf() or plt.close() or del a,b or gc.collect(). All memory is freed when the process ends.

import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import numpy as np

import multiprocessing as mp

def worker():
    N = 1000000
    a = np.arange(N)
    b = np.random.randn(N)

    fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
    fig.set_size_inches(10, 7)
    ax = fig.add_subplot(111)
    ax.plot(a, b)

    fig.savefig('/tmp/random.png')  # code gives me an error here

if __name__ == '__main__':
    proc = mp.Process(target=worker)
    proc.daemon = True
    proc.start()
    proc.join()

You don't even have to call proc.join(). The join blocks the main process until the worker completes; if you omit it, the main process simply continues while the worker runs in the background.

Python matplotlib: memory not being released when specifying figure size

From the docstring for pylab.figure:

In [313]: pylab.figure?

If you are creating many figures, make sure you explicitly call "close" on the figures you are not using, because this will enable pylab to properly clean up the memory.

So perhaps try:

pylab.close()     # closes the current figure

Matplotlib memory leak when saving figure in a loop

I had the same issue; the following solution worked:

import matplotlib
matplotlib.use('Agg')

Source: https://matplotlib.org/stable/faq/howto_faq.html#work-with-threads

Bug: https://github.com/matplotlib/matplotlib/issues/20300
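Putting the Agg backend together with an explicit per-iteration close(), a minimal save-in-a-loop sketch might look like this (the output directory and file names are illustrative):

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # select the non-GUI backend before importing pyplot
import matplotlib.pyplot as plt
import numpy as np

out_dir = tempfile.mkdtemp()
for i in range(5):
    fig, ax = plt.subplots()
    ax.plot(np.random.randn(100))
    fig.savefig(os.path.join(out_dir, f"frame_{i}.png"))
    plt.close(fig)  # release each figure so memory stays flat across iterations
```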


