I have a scenario in which we connect Apache Spark to SQL Server, load table data into Spark, and generate a Parquet file from it.
Here is a snippet of my code:
val database = "testdb"
val jdbcDF = (spark.read.format("jdbc")
  .option("url", "jdbc:sqlserver://DESKTOP-694SPLH:1433;integratedSecurity=true;databaseName=" + database)
  .option("dbtable", "employee")
  .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
  .load())
jdbcDF.write.parquet("/tmp/output/people.parquet")
It works fine in the Spark shell, but I want to automate it from Windows PowerShell or a Windows command script (batch file), so that it can run as a step in a SQL Server Agent job.
I would appreciate any suggestions, or leads.
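The closest approach I have found so far is saving the snippet to a Scala script file and running it non-interactively with `spark-shell`'s `-i` flag from a batch file. A rough sketch of what I mean (the paths, the script name `load_employee.scala`, and the JDBC jar location are placeholders for my environment; `load_employee.scala` would contain the snippet above followed by `System.exit(0)` so the shell exits instead of waiting for input):

```shell
@echo off
REM run_export.cmd -- hypothetical batch wrapper; assumes SPARK_HOME is set
REM and the Microsoft SQL Server JDBC driver jar is at the path below.
"%SPARK_HOME%\bin\spark-shell.cmd" ^
  --driver-class-path "C:\drivers\mssql-jdbc.jar" ^
  -i "C:\jobs\load_employee.scala"
```

My thinking is that a SQL Server Agent job step of type "Operating system (CmdExec)" could then call `run_export.cmd`, but I am not sure whether this is the idiomatic way to do it, or whether packaging the code as an application and using `spark-submit` would be more robust.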