There is an extensive SO thread on how to configure PyCharm to work with pyspark - see here.
What that thread does not include is how to add external packages, like the MongoDB connector you are interested in; you can do this by adding the following entry to your spark-defaults.conf file, located in $SPARK_HOME/conf:
spark.jars.packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0
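
With the connector on the classpath, reading a collection into a DataFrame from pyspark looks roughly like the following - a minimal sketch, assuming a local MongoDB instance and a hypothetical test.coll collection (the URI and collection name are placeholders, adjust them to your own deployment):

    from pyspark.sql import SparkSession

    # Hypothetical URI/collection - replace with your own deployment
    spark = (SparkSession.builder
             .appName("mongo-example")
             .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll")
             .getOrCreate())

    # The connector registers itself as a Spark SQL data source
    df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
    df.printSchema()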
Note that I am not sure this will work (I suspect not) if you choose to install pyspark with pip (the last option mentioned in the accepted answer of the above thread, for Spark >= 2.2), since such an installation does not necessarily come with a $SPARK_HOME/conf directory to hold the file. Personally, I do not recommend installing pyspark with pip, since, as mentioned in the docs:
The Python packaging for Spark is not intended to replace all of the other use cases. This Python packaged version of Spark is suitable for interacting with an existing cluster (be it Spark standalone, YARN, or Mesos) - but does not contain the tools required to setup your own standalone Spark cluster.
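
That said, if you do end up with a pip-installed pyspark and no spark-defaults.conf to edit, one workaround you could try (I have not verified it myself, so treat it as an assumption) is setting spark.jars.packages programmatically on the session builder:

    from pyspark.sql import SparkSession

    # Assumption: spark.jars.packages set on the builder is picked up before
    # the JVM launches, so Spark fetches the connector from Maven Central
    spark = (SparkSession.builder
             .appName("mongo-example")
             .config("spark.jars.packages",
                     "org.mongodb.spark:mongo-spark-connector_2.11:2.2.0")
             .getOrCreate())

The property only has an effect if it is set before the underlying JVM starts, which is why it goes on the builder rather than on an already running session.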