I am using Python 3.8's new shared_memory module and cannot free the shared memory without terminating the processes that use it.
After creating and using a shared memory block shm, I close it via shm.close() in all processes and finally free it via shm.unlink() in the main process. However, the resource monitor shows that the memory is not freed until the program terminates. This is a serious problem for me, because my program needs to run for a long time. The problem can be reproduced on Windows/Python 3.8 with the following program:
from multiprocessing import shared_memory, Pool
from itertools import repeat
from time import sleep


def fun(dummy, name):
    # access shared memory
    shm = shared_memory.SharedMemory(name=name)
    # do work
    sleep(1)
    # release shared memory
    shm.close()
    return dummy


def meta_fun(pool):
    # create shared array
    arr = shared_memory.SharedMemory(create=True, size=500000000)
    # compute result
    result = sum(pool.starmap(fun, zip(range(10), repeat(arr.name))))
    # release and free memory
    arr.close()
    arr.unlink()
    return result


if __name__ == '__main__':
    # use one Pool for many method calls to save the time for repeatedly
    # creating processes
    with Pool() as pool:
        for i in range(100):
            print(meta_fun(pool))
Caution: when executing this script, you may quickly fill your entire memory! Watch the "virtual memory" panel in the resource monitor.
After doing some research, I found out that (1) the unlink() function does nothing on Windows:
def unlink(self):
    """Requests that the underlying shared memory block be destroyed.

    In order to ensure proper cleanup of resources, unlink should be
    called once (and only once) across all processes which have access
    to the shared memory block."""
    if _USE_POSIX and self._name:
        from .resource_tracker import unregister
        _posixshmem.shm_unlink(self._name)
        unregister(self._name, "shared_memory")
and (2) Windows seems to free up shared memory once all the processes that created/used it have stopped (see the comments here and here). This may be the reason why Python does not handle this explicitly.
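Point (1) is easy to confirm in isolation. The following is just a minimal sketch (it peeks at the module's private _USE_POSIX flag, so it is only meant as an illustration):

from multiprocessing import shared_memory

# False on Windows, True on Linux/macOS: the branch inside unlink() shown
# above is never taken on Windows.
print(shared_memory._USE_POSIX)

shm = shared_memory.SharedMemory(create=True, size=1024)
shm.unlink()   # effectively a no-op on Windows: nothing is released here
shm.close()    # only detaches this process's own mapping of the block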
In response, I have built an ugly workaround: I save and reuse the same shared memory block repeatedly without ever unlinking it. Obviously, this is not a satisfactory solution, especially if the sizes of the needed memory blocks change dynamically.
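For reference, this is roughly what that workaround looks like, in simplified form; BLOCK_SIZE and the worker are just taken from the repro script above, and the single fixed-size block is the whole point of (and problem with) this approach:

from multiprocessing import shared_memory, Pool
from itertools import repeat
from time import sleep

BLOCK_SIZE = 500_000_000  # must cover the largest block any call will ever need


def fun(dummy, name):
    # same worker as above: attach, work, detach
    shm = shared_memory.SharedMemory(name=name)
    sleep(1)
    shm.close()
    return dummy


def meta_fun(pool, shm):
    # reuse the pre-allocated block instead of creating/unlinking a new one
    return sum(pool.starmap(fun, zip(range(10), repeat(shm.name))))


if __name__ == '__main__':
    # one block for the whole program lifetime, never recreated
    shm = shared_memory.SharedMemory(create=True, size=BLOCK_SIZE)
    try:
        with Pool() as pool:
            for i in range(100):
                print(meta_fun(pool, shm))
    finally:
        shm.close()
        shm.unlink()  # no-op on Windows, but keeps the script correct on POSIX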
Is there a way I can manually free up the shared memory on Windows?