Unload a module in Python

Question:

TL;DR:

import gc, sys

print len(gc.get_objects()) # 4073 objects in memory

# Attempt to unload the module

import httplib
del sys.modules["httplib"]
httplib = None

gc.collect()
print len(gc.get_objects()) # 6745 objects in memory

UPDATE
I’ve contacted the Python developers about this problem, and indeed it’s not going to be possible to unload a module completely “in the next five years” (see the link).

Please accept that Python 2.x indeed does not support unloading modules completely, due to severe, fundamental, insurmountable technical problems.


During my recent hunt for a memory leak in my app, I narrowed it down to modules, namely my inability to garbage collect an unloaded module. Every method listed below for unloading a module leaves thousands of objects in memory. In other words, I can’t unload a module in Python…

The rest of the question is an attempt to garbage collect a module somehow.

Let’s try:

import gc
import sys

sm = sys.modules.copy()  # httplib, which we'll try to unload, isn't yet
                         # in sys.modules, so this isn't the source of the problem

print len(gc.get_objects()) # 4074 objects in memory

We’ve saved a copy of sys.modules so we can attempt to restore it later.
So, the baseline is 4074 objects; ideally, we should somehow return to this number.

Let’s import a module:

import httplib
print len(gc.get_objects()) # 7063 objects in memory

We’re up to 7K non-garbage objects.
Let’s try removing httplib from sys.modules.

sys.modules.pop('httplib')
gc.collect()
print len(gc.get_objects()) # 7063 objects in memory

Well, that didn’t work. Hmm, but isn’t there a reference in __main__? Oh, yeah:

del httplib
gc.collect()
print len(gc.get_objects()) # 6746 objects in memory

Hooray, down about 300 objects. Still no cigar, though; that’s way more than the original 4074 objects.
Let’s try restoring sys.modules from the copy.

sys.modules = sm
gc.collect()
print len(gc.get_objects()) # 6746 objects in memory

Hmmm, well, that was pointless; no change at all.
Maybe if we wipe out globals…

globals().clear()
import gc # we need this since gc was in globals() too
gc.collect()
print len(gc.get_objects()) # 6746 objects in memory

locals?

locals().clear()
import gc # we need this since gc was in locals() too
gc.collect()
print len(gc.get_objects()) # 6746 objects in memory

What the…? What if we imported the module inside an exec?

local_dict = {}
exec 'import httplib' in local_dict
del local_dict
gc.collect()
print len(gc.get_objects())  # back to 7063 objects in memory

Now, that’s not fair: it imported it into __main__. Why? It should never have left local_dict… Argh! We’re back to a fully imported httplib.
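
(An aside, not in the original gist: in a fresh Python 2 interpreter you can check where the exec’d import actually ends up. The name is bound only inside local_dict; it’s the entry the import machinery records in sys.modules that keeps the module alive.)

import sys

local_dict = {}
exec 'import httplib' in local_dict

print 'httplib' in globals()     # False - the name never reaches __main__
print 'httplib' in local_dict    # True  - bound only inside the exec namespace
print 'httplib' in sys.modules   # True  - this entry keeps the module alive
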
Maybe if we replaced it with a dummy object?

from types import ModuleType
import sys
print len(gc.get_objects())  # 7064 objects in memory

Bloody…..!!

sys.modules['httplib'] = ModuleType('httplib')
print len(gc.get_objects())  # 7066 objects in memory

Die modules, die!!

import httplib
for attr in dir(httplib):
    setattr(httplib, attr, None)
gc.collect()
print len(gc.get_objects())  # 6749 objects in memory

Okay, after all these attempts, the best result is +2675 objects (nearly +50%) over the starting point… and that’s from just one module, which doesn’t even have anything big inside…
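
One thing the gist doesn’t do is look at what those surviving objects actually are. A rough diagnostic sketch (not part of the original, Python 2) that groups the live objects by type:

import gc
from collections import Counter

gc.collect()
type_counts = Counter(type(obj).__name__ for obj in gc.get_objects())
for name, count in type_counts.most_common(10):
    print name, count   # typically dominated by dict, tuple, function, ...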

Ok, now seriously, where’s my error?
How do I unload a module and wipe out all of its contents?
Or are Python’s modules one giant memory leak?

Full source in a simpler-to-copy form: http://gist.github.com/450606

Asked By: Slava V


Answers:

Python does not support unloading modules.

However, unless your program loads an unlimited number of modules over time, that’s not the source of your memory leak. Modules are normally loaded once at start up and that’s it. Your memory leak most likely lies elsewhere.

In the unlikely case that your program really does load an unlimited number of modules over time, you should probably redesign your program. 😉

Answered By: Daniel Stutzbach

(You should try to write more concise questions; I’ve only read the beginning and skimmed the rest.) I see a simple problem at the start:

sm = sys.modules.copy()

You made a copy of sys.modules, so now your copy has a reference to the module, and of course it won’t be collected. You can see what’s referring to it with gc.get_referrers.
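
For example, a hypothetical snippet (not from the answer, assuming a fresh Python 2 session where httplib has just been imported) that lists what still refers to the module:

import gc, sys
import httplib

mod = sys.modules['httplib']
for referrer in gc.get_referrers(mod):
    # expect at least the sys.modules dict and __main__'s globals to show up
    if isinstance(referrer, dict):
        print type(referrer), sorted(referrer.keys())[:5]
    else:
        print type(referrer), repr(referrer)[:80]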

This works fine:

# module1.py
class test(object):
    def __del__(self):
        print "unloaded module1"
a = test()

print "loaded module1"


# testing.py
def run():
    print "importing module1"
    import module1
    print "finished importing module1"

def main():
    run()
    import sys
    del sys.modules["module1"]
    print "finished"

if __name__ == '__main__':
    main()

module1 is unloaded as soon as we remove it from sys.modules, because there are no remaining references to the module. (Doing module1 = None after importing would work, too; I just put the import in another function for clarity. All you have to do is drop your references to it.)

Now, it’s a little tricky to do this in practice, because of two issues:

  • In order to collect a module, all references to the module must be unreachable (as with collecting any object). That means that any other modules that imported it need to be dereferenced and reloaded, too.
  • If you remove a module from sys.modules while it’s still referenced somewhere else, you’ve created an unusual situation: the module is still loaded and used by code, but the module loader no longer knows about it. The next time you import the module, you won’t get a reference to the existing one (since you deleted the record of it), so a second, coexisting copy of the module will be loaded. This can cause serious consistency problems (a short sketch follows this list). So, make sure that there are no remaining references to the module before finally removing it from sys.modules.
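
A hypothetical sketch of that second hazard, using httplib as a stand-in module (an illustration assuming a fresh Python 2 interpreter, not code from the answer):

import sys
import httplib

kept = httplib                  # pretend some other code still holds this reference
del sys.modules['httplib']      # the loader forgets about the loaded module

import httplib                  # the next import builds a brand-new module object
print kept is httplib                                # False - two copies now coexist
print kept.HTTPConnection is httplib.HTTPConnection  # False - even the classes differ

An old HTTPConnection instance created before the re-import would now fail an isinstance() check against the new httplib.HTTPConnection, which is exactly the kind of consistency problem described above.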

There are some tricky problems with using this in general: detecting which modules depend on the module you’re unloading; knowing whether it’s okay to unload those too (which depends heavily on your use case); handling threading while examining all this (take a look at imp.acquire_lock); and so on.
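
For the first of those problems, here is a rough, hypothetical helper (not from the answer) that scans sys.modules for modules whose globals hold a direct reference to the target:

import sys

def modules_referencing(target_name):
    """List names of loaded modules whose globals directly reference the target."""
    target = sys.modules[target_name]
    dependents = []
    for name, mod in sys.modules.items():
        if mod is None or mod is target:
            continue
        if any(value is target for value in vars(mod).values()):
            dependents.append(name)
    return dependents

# e.g. modules_referencing('httplib') lists modules that must also be
# dereferenced before the target can become unreachable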

I could contrive a case where doing this might be useful, but most of the time I’d recommend just restarting the app if its code changes. You’ll probably just give yourself headaches.

Answered By: Glenn Maynard

I’m not sure about Python, but in other languages, calling the equivalent of gc.collect() does not release unused memory – it will only release that memory if/when the memory is actually needed.

Otherwise, it makes sense for Python to keep the modules in memory for the time being, in case they need to be loaded again.

Python’s small object manager rarely returns memory back to the operating system (see here and here). So, strictly speaking, Python has (by design) a kind of memory leak, even when objects are “gc collected”.

Answered By: ilias iliadis

I cannot find an authoritative perspective on this for Python 3 (10 years later, now Python 3.8). However, we can do better percentage-wise now.

import gc
import sys

the_objs = gc.get_objects()
print(len(gc.get_objects())) # 5754 objects in memory
origin_modules = set(sys.modules.keys())
import http.client # it was renamed ;)

print(len(gc.get_objects())) # 9564 objects in memory
for new_mod in set(sys.modules.keys()) - origin_modules:
    del sys.modules[new_mod]
    try:
        del globals()[new_mod]
    except KeyError:
        pass
    try:
        del locals()[new_mod]
    except KeyError:
        pass
del origin_modules
# importlib.invalidate_caches()  happens to not do anything
gc.collect()
print(len(gc.get_objects())) # 6528 objects in memory 

That’s only a 13% increase. If you look at the kinds of objects that show up in the new gc.get_objects() results, some of them are builtins, source code, random.* utilities, datetime utilities, etc. I am mostly leaving this here as an update for @shuttle to start the conversation, and will delete it if we can make more progress.
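
As an illustration (not part of the original answer), the otherwise-unused the_objs snapshot above can serve as a baseline for seeing roughly what the import left behind:

from collections import Counter

# assumes the snippet above has just run; id reuse makes this only approximate
baseline_ids = {id(obj) for obj in the_objs}
survivors = [obj for obj in gc.get_objects() if id(obj) not in baseline_ids]
print(Counter(type(obj).__name__ for obj in survivors).most_common(10))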

Answered By: modesitt