I'm trying to pass arguments to my Databricks job and read them back. It's a spark_python_task type, NOT a notebook. I deployed the job with dbx from PyCharm, and I have a deployment.json file where I configure the deployment.
If you follow the documentation on the deployment file, you can see that you can specify job parameters via the parameters array:
{
    "name": "this-parameter-is-required!",
    "spark_python_task": {
        "python_file": "path/to/entrypoint.py"
    },
    "parameters": [
        "--conf-file",
        "conf/test/sample.json"
    ]
}
Parameters are passed as command-line arguments, so you can read them in your code with sys.argv or the built-in argparse module.
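For example, a minimal sketch of the entrypoint using argparse (the flag name matches the deployment example above; the parse_args helper is illustrative, not part of dbx):

```python
import argparse


def parse_args(argv=None):
    """Parse the command-line flags that dbx passes through verbatim."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--conf-file", dest="conf_file", required=True)
    # argv=None makes argparse fall back to sys.argv[1:] when run as a job
    return parser.parse_args(argv)
```

With the parameters array above, parse_args() would yield conf_file == "conf/test/sample.json".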

Alex Ott
When I do this I receive the path of the JSON as an argument, not the elements from the JSON. Is this the expected behavior? – Borislav Blagoev Aug 09 '21 at 13:02
Yes - the arguments are just whatever you pass. If you have data in a file, you need to read it yourself – Alex Ott Aug 09 '21 at 13:24
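As the comment says, only the path arrives as an argument; loading the file's contents is up to the script. A minimal sketch of doing that, assuming the same --conf-file flag (the load_conf helper is hypothetical):

```python
import argparse
import json


def load_conf(argv=None):
    """Parse --conf-file, then open and deserialize the JSON it points to."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--conf-file", dest="conf_file", required=True)
    args = parser.parse_args(argv)
    # The job only receives the path string; reading the file happens here
    with open(args.conf_file) as f:
        return json.load(f)
```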