
Uncertainly importing modules with Python [Resolved]

I'm developing a module that should be able to interact with other modules such as numpy or PIL. To accept parameters and test whether, for example, a parameter is a numpy.ndarray, I'd need to import numpy first.

What I used to do on previous projects is:

    try:
        import numpy as np
        numpy_exists = True
    except ImportError:
        numpy_exists = False

Then, when accepting parameters that could be of that type, I checked them like this:

    if numpy_exists and type(param) == np.ndarray:
        # ...

Now, it works, but it feels fragile: when I look at that statement, the first thing I see is the NameError this code could raise. Of course, it never actually raises one, because the numpy_exists guard short-circuits before np is ever evaluated, but code like that makes me feel awkward when looking at it.

Is this a good way to handle the problem? Is there a better one?
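For reference, a common variant of the pattern described above (a sketch of my own, not code from the question) binds the name to None when the import fails, so the name is always defined and the flag can never get out of sync with it:

```python
try:
    import numpy as np
except ImportError:
    # bind the name anyway, so later references can never raise NameError
    np = None

def describe(param):
    # np is always defined here: either the module or None
    if np is not None and isinstance(param, np.ndarray):
        return "ndarray"
    return "other"
```

With this shape, the check reads as an ordinary None test rather than a guard against an unbound name.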

Question Credit: Yotam Salmon
Asked March 10, 2018
Posted Under: Programming
1 Answer

I think that this is generally fine. But from a design standpoint, I'd probably solve it by moving the type checking for parameters into its own function (or even into a decorator, which looks much nicer). Then you can provide two different functions, depending on whether numpy exists or not, and not even incur any speed cost. E.g.:

    def typecheck_without_numpy(x):
        # validate x using only built-in types
        ...

    def typecheck_with_numpy(x):
        # validate x, including numpy types such as numpy.ndarray
        ...

    try:
        import numpy
        typecheck = typecheck_with_numpy
    except ImportError:
        typecheck = typecheck_without_numpy

    def some_function(x):
        typecheck(x)
        ...
Of course, you can rename typecheck to something like type_check_and_cast and then either have it convert x to something your function can handle, or raise a TypeError. This way, your code isn't littered with type checks, and, as I said, you could move the typecheck into a decorator, so your functions would look something like

    @typecheck
    def some_function(x):
        ...
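As a minimal sketch of how such a decorator might be written (the convert-or-raise behavior and the total function are my own illustration, not code from the answer):

```python
import functools

try:
    import numpy as np
except ImportError:
    np = None

def typecheck(func):
    """Convert numpy arrays to plain lists; reject unsupported types."""
    @functools.wraps(func)
    def wrapper(x):
        if np is not None and isinstance(x, np.ndarray):
            # cast arrays so func only ever sees built-in types
            x = x.tolist()
        elif not isinstance(x, (int, float, list)):
            raise TypeError(f"unsupported type: {type(x).__name__}")
        return func(x)
    return wrapper

@typecheck
def total(x):
    return sum(x) if isinstance(x, list) else x
```

The decorated function then works the same whether numpy is installed or not, and callers get a clear TypeError instead of a failure deep inside the function body.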
credit: Pascal
Answered March 10, 2018