I'm running a Spark job on EMR Serverless and uploaded a Python library I developed to the cluster as a zip archive - but the issue itself is purely about Python imports.
Let's say I have a project with the following structure:

```
├── package_a
│   ├── __init__.py
│   └── class_a.py
├── package_b
│   ├── __init__.py
│   └── class_b.py
├── main.py
└── requirements.txt
```
And the file contents:

**package_a/__init__.py**

```python
from .class_a import *
```

**package_a/class_a.py**

```python
from ..package_b.class_b import b

class a:
    def __init__(self):
        self.b = b()
```

**package_b/__init__.py**

```python
from .class_b import *
```

**package_b/class_b.py**

```python
class b:
    def __init__(self):
        ...
```
The question is: why do I need the `..` prefix before `package_b` in `class_a.py`? Without it I get this error:

```
ModuleNotFoundError: No module named 'package_b'
```
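For context, here is a minimal, self-contained reproduction of the layout above that I can run locally. It rebuilds the same two packages in a temporary directory (the paths are illustrative, not my real project), uses the *absolute* form `from package_b.class_b import b`, and shows that it resolves as long as the project root is on `sys.path` - which, as I understand it, is roughly what submitting the zip via `--py-files` is supposed to arrange:

```python
import os
import sys
import tempfile

# Rebuild the project layout from the question in a temp directory.
root = tempfile.mkdtemp()
for pkg in ("package_a", "package_b"):
    os.makedirs(os.path.join(root, pkg))

def write(relpath, text):
    with open(os.path.join(root, relpath), "w") as f:
        f.write(text)

write("package_b/__init__.py", "from .class_b import *\n")
write("package_b/class_b.py",
      "class b:\n"
      "    def __init__(self):\n"
      "        ...\n")
write("package_a/__init__.py", "from .class_a import *\n")
# Absolute import instead of the relative `..` form from the question:
write("package_a/class_a.py",
      "from package_b.class_b import b\n"
      "class a:\n"
      "    def __init__(self):\n"
      "        self.b = b()\n")

# The absolute import only resolves if the project root is on sys.path.
sys.path.insert(0, root)
from package_a import a

print(type(a().b).__name__)  # prints "b"
```

Run standalone this works, so I suspect the difference comes down to what ends up on `sys.path` when the zip is shipped to EMR Serverless.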