
I have a Python project `pypypy` with two files: `__main__.py` and `foo.py`. In `__main__.py` I simply do `import foo`. It all works fine.

Now I want to distribute it via PyPI. After installing my module I'm executing it with `python -m pypypy`. When I do that, the `import foo` statement doesn't work anymore. However, `import pypypy.foo` does the job.
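
The layout is roughly:

```
pypypy/
├── __main__.py   # does `import foo`
└── foo.py
```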

Should I change all my imports before distribution, or is there a better way?

Stan Kurilin
  • What Python version are you using? The import system differs between Python 2 and 3. Check [this answer](https://stackoverflow.com/a/12173406/8014793) – hurlenko Sep 04 '19 at 09:50
  • @hurlenko python 3.7 – Stan Kurilin Sep 04 '19 at 09:55
  • You should prefer imports qualified with the package you distribute (like `import pypypy.foo`). The reason it works in your dev environment might be PYTHONPATH manipulation. For example, PyCharm automatically sets _Add content and source roots to PYTHONPATH_. Also, when you run `python`, the current working directory is automatically added to the module search path (`sys.path`) – hurlenko Sep 04 '19 at 10:04
  • @hurlenko do you want to post it as an answer? – Stan Kurilin Sep 04 '19 at 10:27

1 Answer


Using absolute imports is strongly recommended, as they work consistently across Python versions; check [this answer](https://stackoverflow.com/a/12173406/8014793). In your case you should prefer `import pypypy.foo`.
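
For example, in `__main__.py` (a minimal sketch; `do_something` stands in for whatever `foo.py` actually defines):

```python
# pypypy/__main__.py

# Absolute import through the package name:
import pypypy.foo
pypypy.foo.do_something()

# Or bind the submodule directly, so existing `foo.<...>` references keep working:
from pypypy import foo
foo.do_something()
```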

The reason it works in your dev environment might be PYTHONPATH manipulation. For example, PyCharm automatically sets _Add content and source roots to PYTHONPATH_. Also, when you run `python`, the current working directory (or the script's directory) is automatically added to the module search path (`sys.path`).
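
If you want to see the difference between the two environments, you can print the module search path in each (a quick diagnostic, not part of the fix):

```python
import sys
print(sys.path)  # in the dev setup the project/source root typically shows up here
```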

hurlenko