Sorry for the newbie Jupyter question -
I've installed Jupyter & PySpark following this guide - https://blog.sicara.com/get-started-pyspark-jupyter-guide-tutorial-ae2fe84f594f
Everything seems to work, but I don't get autocomplete for some "nested" functions.
For example - running `spark` -> I get the SparkSession.
When I press Tab after `spark.` -> I get a list of suggestions such as `read`.
But pressing Tab after `spark.read.` doesn't show anything, though I would expect options such as `csv`, `parquet`, etc.
Important note - running `spark.read.csv("1.txt")` works.
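
Here is a minimal sketch of what I'm seeing in a cell (assuming `spark` is the session created by the guide's setup; the file name is just a placeholder):

```python
# Minimal reproduction in a Jupyter cell, assuming `spark` is the session from the guide's setup
spark                     # shows the SparkSession repr, so the session exists
spark.read                # resolves to a DataFrameReader
                          # Tab after "spark."      -> suggestions like read, sql, stop
                          # Tab after "spark.read." -> no suggestions at all
spark.read.csv("1.txt")   # yet this runs fine ("1.txt" is just a placeholder file)
```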
Also - I tried applying the suggestions from `ipython tab autocomplete does not work on imported module`, but it didn't help.
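
In case it matters, this is roughly what I ran from that thread (assuming the suggestion it refers to is the greedy-completion setting, which lets IPython evaluate objects to complete their attributes):

```python
# Attempted fix from the linked thread (assumption: the greedy completer setting is what's meant)
%config IPCompleter.greedy=True   # ask IPython to evaluate objects when completing attributes
# Even with this set, Tab after "spark.read." still shows no suggestions.
```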
What am I missing?