
In the Python code of the MicroPython project, import statements sometimes appear inside functions:

def main():
    import argparse
    cmd_parser = argparse.ArgumentParser(description='A tool to work with MicroPython .mpy files.')
    cmd_parser.add_argument('-d', '--dump', action='store_true',
        help='dump contents of files')
    cmd_parser.add_argument('-f', '--freeze', action='store_true',
        help='freeze files')
    ...

Is this good practice?
Is there a technical reason for this?

Bob
  • It only imports when needed, which can speed up startup. On the other hand, when a module is missing, you only find out at run time, which is ... random – Jean-François Fabre Oct 18 '17 at 15:06
  • @Jean-FrançoisFabre on the other, other hand, unless the function is run only once, it slows things down – bendl Oct 18 '17 at 15:08
  • @Jean-FrançoisFabre `when modules are missing, you see it when running` - what does that mean? – Bob Oct 18 '17 at 15:09
  • It means that, if the module is missing, it only crashes with `ImportError` when the function is actually called. So if the function is called very rarely, the error can lurk in code that appears to be working (a sketch of this follows the thread). – roganjosh Oct 18 '17 at 15:10
  • ^Which means you have to make sure you're testing your code very thoroughly. Every function that imports something could potentially cause an error. – bendl Oct 18 '17 at 15:13
  • The main reason here is likely to avoid polluting the module namespace with imports that are only used by the main entry point - but that's still not considered good practice. – bruno desthuilliers Oct 18 '17 at 15:15
  • It should be noted, though, while we're here, that the code you're mentioning is written to have a very small footprint. Importing modules only when they're needed can help keep your memory footprint low – bendl Oct 18 '17 at 15:15
  • As well as speeding things up, it also saves RAM, since the module only gets loaded if it's needed. And that can be important on systems that need to run MicroPython. But @bendl makes a very good point: if the function is called multiple times then having the `import` in the function is less efficient, since the interpreter has to check on every function call whether the module is already loaded (a rough timing sketch follows the thread). Of course, this check is much faster than actually loading the module from storage, but still... – PM 2Ring Oct 18 '17 at 15:15
  • Given that it's imported in a script's `main` function, it's unlikely to bring much speedup (`main` is always called) or memory benefit (`main` is always on the stack). And the script's name is not importable, so no other module will import it and skip the argument parsing. So I'd guess this is just the programmer's personal style. – snakecharmerb Oct 18 '17 at 15:18
  • @PM2Ring followup question and discussion [python - import inside a function: is memory reclaimed upon function exit](https://stackoverflow.com/questions/46813776/python-import-inside-a-function-is-memory-reclaimed-upon-function-exit) – Bob Oct 18 '17 at 15:25
  • @snakecharmerb I almost always do my imports at the top of the script. But the exception is when I have code that's primarily designed to be imported as a module but which has a `main` for testing purposes (or to allow simple use of some module functions from the command line). So under normal use of such a file, its `main` _isn't_ called. In that case, I may put an import (or two) into the `if __name__ == "__main__":` section (a sketch of that pattern follows the thread). I guess putting the import into the `main()` function itself is kind of equivalent, but I _really_ dislike seeing imports inside function definitions. ;) – PM 2Ring Oct 18 '17 at 15:34
  • @PM2Ring agreed. `mpy-tool.py` can only be called from the command line though, so putting this import inside `main` has no benefit. Maybe the developer originally thought it might be importable, and had a memory that `argparse` was a heavy import as [this bug](https://bugs.python.org/issue30152) suggests. And yes, imports inside functions are a pet peeve of mine too. – snakecharmerb Oct 18 '17 at 15:48
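
To illustrate the deferred-failure point made by Jean-François Fabre and roganjosh above, here is a minimal sketch; `nonexistent_module` is a hypothetical name assumed not to be installed:

```python
def rarely_called():
    import nonexistent_module  # hypothetical; assumed not installed
    return nonexistent_module.do_something()

print("script starts up fine; nothing has failed yet")

try:
    rarely_called()  # the ImportError only surfaces here, at call time
except ImportError as exc:
    print(f"deferred failure: {exc}")
```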
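
And a rough timing sketch of the per-call cost bendl and PM 2Ring discuss: after the first call, the function-level `import` is satisfied from the `sys.modules` cache, but the import statement is still re-executed on every call. (`json` stands in for any stdlib module; exact numbers will vary by machine.)

```python
import json    # module-level import: resolved once, at import time
import timeit

def inner_import():
    import json  # re-executed per call; hits the sys.modules cache
    return json.dumps

def outer_import():
    return json.dumps  # plain global lookup, no import machinery

print("inner:", timeit.timeit(inner_import, number=1_000_000))
print("outer:", timeit.timeit(outer_import, number=1_000_000))
```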
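
Finally, a sketch of the alternative PM 2Ring describes: a file meant primarily to be imported, whose CLI-only dependencies live in the `__main__` block. All names here are hypothetical:

```python
def useful_function(x):
    """The part other modules actually import."""
    return x * 2

if __name__ == "__main__":
    # Only executed when the file is run as a script, never on import,
    # so modules importing this file never pay for argparse.
    import argparse

    parser = argparse.ArgumentParser(description="Exercise useful_function.")
    parser.add_argument("x", type=int)
    args = parser.parse_args()
    print(useful_function(args.x))
```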

0 Answers