
I have a package that I have built locally and wish to distribute for use in another package. After pip install . I am able to import the package and run its functions in Python, but only from the original project directory. For example, immediately after installing the package:

>>> import mypkg
>>> mypkg.components.body_columns.dummify
<function dummify at 0x7fb380650e50>

However, as soon as I leave the directory, the package can no longer be imported. For example, after cd ~/Desktop:

>>> import mypkg
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'mypkg'

Additionally, at the install location reported by pip show mypkg, there is currently only a mypkg-1.3.0.dist-info directory and no mypkg directory. I have been trying to follow this guidance, but with no luck yet, and that seems like a distinct issue.

I am new to distributing packages - any idea what is going on?

If helpful, the (simplified) hierarchical structure is:

mypkg
├── mypkg
│   └── components
│       ├── __init__.py
│       └── body_columns.py
├── __init__.py
├── app.py
└── setup.py

The root-level __init__.py reads: from . import mypkg.

1 Answer


I figured this out. It was a basic issue related to where I placed my __init__.py files.

The proper file structure places the first __init__.py not in the root directory, but inside the inner mypkg directory that shares the package's name:

mypkg
├── mypkg
│   ├── __init__.py
│   └── components
│       ├── __init__.py
│       └── body_columns.py
├── app.py
└── setup.py
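
Assuming setup.py uses setuptools.find_packages() (a minimal sketch is below; the name and version are taken from the question, everything else is illustrative), the placement of __init__.py is exactly what matters: find_packages() only treats a directory as a package if it contains an __init__.py. With the old layout it found no packages at all, which would explain why pip installed nothing but the mypkg-1.3.0.dist-info metadata.

# setup.py - minimal sketch; name/version from the question, the rest illustrative
from setuptools import setup, find_packages

setup(
    name="mypkg",
    version="1.3.0",
    # find_packages() only picks up directories that contain an __init__.py
    packages=find_packages(),
)

A quick way to check what setuptools will actually pick up is to call find_packages() from the project root:

>>> from setuptools import find_packages
>>> find_packages()
['mypkg', 'mypkg.components']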

The reason I was able to import from the original directory is, obvious in hindsight, that I was in the same directory as the mypkg folder itself: Python was importing it from my current working directory, just as it would any script sitting there (though in this case it was the package).
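
As a sanity check after reinstalling with pip install ., the import should now work from any directory, and mypkg.__file__ should point into site-packages rather than at the project folder (the path below is abbreviated and purely illustrative):

>>> import mypkg
>>> mypkg.__file__
'/.../site-packages/mypkg/__init__.py'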