You have to install the packages on all worker nodes. You could use cssh (ClusterSSH) to make your life a bit easier.
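For illustration, a minimal sketch of that manual approach, assuming passwordless SSH and placeholder hostnames worker1..worker3 (adjust the interpreter, privileges, and package list to your cluster):

```
# Install the needed packages on each worker; hostnames and package names
# are placeholders. Use the same interpreter the Spark executors run.
for host in worker1 worker2 worker3; do
  ssh "$host" 'sudo python3 -m pip install numpy pandas'
done

# Or open one synchronized shell on all workers with ClusterSSH and type
# the pip install command once:
# cssh worker1 worker2 worker3
```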
An alternative to installing every pip package in advance is to use a requirements.txt (and, preferably, a virtualenv). To use a requirements.txt, launch spark-submit with the following parameters:
```
spark-submit \
  --conf spark.pyspark.virtualenv.enabled=true \
  --conf spark.pyspark.virtualenv.type=native \
  --conf spark.pyspark.virtualenv.requirements=/Users/jzhang/github/spark/requirements.txt \
  --conf spark.pyspark.virtualenv.bin.path=/Users/jzhang/anaconda/bin/virtualenv \
  --conf spark.pyspark.python=/usr/local/bin/python3 \
  spark_virtualenv.py
```
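If you don't already have a requirements.txt, one way to produce it is to pin the packages from a local environment that already runs the job (the package names below are placeholders):

```
# Capture everything installed in the current (virtual) environment:
pip freeze > requirements.txt

# Or write it by hand with just the packages the job needs, one per line:
#   numpy
#   pandas
```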
Please find further information at [2].