I have a Spark job that reads a source table, performs a number of map/flatten/DataFrame operations, and then writes the results from a temp table into a separate table we use for reporting. Currently this job is run manually using the spark-submit script. I want to schedule it to run every night.
Is there any way to schedule a Spark job for batch processing, similar to a nightly batch ETL?
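For context, the manual run looks roughly like the following; the script path, main class, and master URL are placeholders, not the actual values from my setup:

```shell
# Hypothetical sketch of how the job is launched by hand today.
# Class name, master, and jar path are placeholders for illustration.
spark-submit \
  --class com.example.ReportingEtlJob \
  --master yarn \
  --deploy-mode cluster \
  /path/to/reporting-etl.jar
```

I am looking for a way to trigger this same invocation automatically on a nightly schedule.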