Thursday, 23 January 2020

Memory leaks in a hungry Python worker

I have a few workers that listen to a RabbitMQ queue and do some disk-I/O-intensive work: opening ~18MB files, doing some parsing, and writing to some files. While processing one job a worker can take up to 200MB of memory, and this is fine.

However, my problem is that the worker then sits idle while still holding on to that amount of memory. I have blindly tried triggering garbage collection manually with gc.collect() after the job is done, but without any results.
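Concretely, the manual collection attempt looks roughly like this (collect_and_report is just a throwaway helper name of mine, not part of the worker code):

```python
import gc

def collect_and_report():
    # Force a full collection; gc.collect() returns the number of
    # unreachable objects it found during this pass.
    unreachable = gc.collect()
    # gc.garbage lists objects the collector found but could not free
    # (e.g. cycles involving __del__ in Python 2).
    print("unreachable objects found:", unreachable)
    print("uncollectable objects:", len(gc.garbage))
    return unreachable
```

Even when this runs after every job, the process's resident memory as reported by the OS stays the same.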

My worker class that receives the job looks like this:

class BuildWorker(worker.Worker):

    def callback(self, ch, method, properties, body):
        fp = FileParseAndStuff()
        fp.execute_job(ch, method, properties, body)
        fp = None

Shouldn't everything that happens inside fp here be contained, memory-wise, and be freed once I set that object to None? I have tried Python's del statement as well, but without any improvement.
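One way to at least verify that the fp object itself is being freed is a weakref (the FileParseAndStuff class below is just a stub standing in for the real parser):

```python
import weakref

class FileParseAndStuff(object):
    """Stub standing in for the real parser class."""
    pass

fp = FileParseAndStuff()
ref = weakref.ref(fp)   # a weak reference does not keep fp alive
fp = None               # drop the last strong reference
print(ref() is None)    # True: CPython's refcounting frees the object right away
```

If ref() returns None, the object really is gone; the resident size staying high would then just mean CPython's allocator is keeping the freed memory around for reuse instead of returning it to the OS.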

I'm using Python 2.7 and python-pika to communicate with the RabbitMQ server, if that matters.

