This thread here showed how to run a Python script file with pyspark. In particular, this is the command I am using:

% pyspark < script.py

I want to pass an argument (a config file) to this script.py. Normally, running with Python alone, this would work:

% python script.py conf.ini

But with pyspark:

% pyspark < script.py conf.ini

I get the following error message:

Error: pyspark does not support any application options.

Is it possible to do this execution?

    Does this answer your question? [Can I add arguments to python code when I submit spark job?](https://stackoverflow.com/questions/32217160/can-i-add-arguments-to-python-code-when-i-submit-spark-job) – Mykola Zotko Jul 03 '22 at 19:24
  • Thanks, using `spark-submit` works for me as well. – Tristan Tran Jul 04 '22 at 09:54

1 Answer

Answering this to get it off the unanswered queue. Use `spark-submit` in combination with `sys.argv` to get the input:

spark-submit script.py conf.ini
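
On the script side, the argument shows up in `sys.argv` just as it would under plain Python. A minimal sketch of what `script.py` might look like (the function name and the usage message are illustrative, not from the original post):

```python
import sys

def get_config_path(argv):
    # argv[0] is the script name; argv[1] is the first user-supplied argument
    if len(argv) < 2:
        raise SystemExit("usage: spark-submit script.py <conf.ini>")
    return argv[1]

if __name__ == "__main__":
    conf_path = get_config_path(sys.argv)
    # ... build the SparkSession and load conf_path here ...
    print(conf_path)
```

Note that the `pyspark < script.py` form pipes the script to the interactive shell's stdin, which is why extra arguments are rejected; `spark-submit` runs the script as a proper application, so arguments pass through normally.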