I am using a large module in many of my files that takes some time to import. Will importing it in every file waste time?

Question:

I have a module that takes a while to import, let’s call it big_module. This module creates several variables that I need in my other files. I use this module in many of my helper files, called helper1, helper2, etc…

I have a main file that imports each helper file, so my files would look like this:

# helper1.py

import big_module

def do_stuff1(input):
    # code that uses big_module

# helper2.py

import big_module

def do_stuff2(input):
    # code that uses big_module

and so on for the helper files. Then I have my main file:

# main.py

import helper1
import helper2
# and so on

data1 = [some data]
data2 = helper1.do_stuff1(data1)
data3 = helper2.do_stuff2(data2)
# and so on

When I import each helper, and each helper subsequently imports big_module, does big_module get rerun every time, causing me to lose time, or does Python cache it or something so that it is only run once? And if importing this in several files does waste time, is there a good way to only have to import it once?

Asked By: Elan SK


Answers:

No. The importer first checks whether the module has already been imported. If so, you get back a reference to the existing module from the cache in sys.modules; it is not re-imported or re-run. You can test this by adding a print at module level to the .py file: you will only see that print once.
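
For example, a minimal sketch of that test, using a hypothetical slow_module.py as a stand-in for big_module:

# slow_module.py
print("running slow_module")       # module-level code: executes only on the first import
big_data = list(range(1_000_000))  # stands in for the expensive setup work

# check.py
import slow_module                 # prints "running slow_module"
import slow_module                 # already cached: prints nothing this time

import sys
print("slow_module" in sys.modules)    # True -- Python keeps the module object in sys.modules

Running check.py prints the message once, even though the import statement appears twice; any later import of slow_module, from this file or any other, just returns the cached module object.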

It would be very bad if Python re-imported the module on every import statement. Each importer would then see a different namespace for its copy, and if the imported module had global variables there would be a separate set of them per import, making it difficult to hold state that is valid for the entire program.
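
Applied to the layout in the question, helper1 and helper2 therefore end up holding references to the same cached module object, so whatever variables big_module creates are set up once and shared. A rough sketch (the settings attribute is hypothetical, standing in for any module-level variable big_module might define):

# main.py
import helper1
import helper2

print(helper1.big_module is helper2.big_module)   # True -- both helpers share one module object

helper1.big_module.settings = "tuned"             # mutate module-level state through one helper...
print(helper2.big_module.settings)                # ...and the other helper sees "tuned"

So the cost of running big_module's module-level code is paid only once, no matter how many helper files import it.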

Answered By: tdelaney