I don't have much experience with Maven and Spark, and everything I've done so far has been in Scala. Now I have to develop a project in PySpark, and I was wondering whether it is possible to create a PySpark project with Maven, and if so, how I would have to set up the pom file.
So far in my pom I have specified, for example, these properties:
<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <maven.assembly.plugin.version>3.1.0</maven.assembly.plugin.version>
    <maven.antrun.plugin.version>1.8</maven.antrun.plugin.version>
    <maven.surefire.plugin.version>3.0.0-M5</maven.surefire.plugin.version>
    <maven.surefire.report.plugin.version>2.18.1</maven.surefire.report.plugin.version>
    <maven.shade.plugin.version>3.1.1</maven.shade.plugin.version>
    <maven.site.plugin.version>3.6</maven.site.plugin.version>
    <maven.project.info.reports.plugin.version>2.2</maven.project.info.reports.plugin.version>
    <scala.maven.plugin.version>4.1.1</scala.maven.plugin.version>
    <maven.scalastyle.plugin.version>1.0.0</maven.scalastyle.plugin.version>
    <encoding>UTF-8</encoding>
    <scala.version>2.11.12</scala.version>
    <spark.version>2.4.0.cloudera2</spark.version>
    <hive-service.version>3.1.2</hive-service.version>
    <spark.databricks.version>1.5.0</spark.databricks.version>
    ...
</properties>
Would it work the same way, just replacing <scala.version>2.11.12</scala.version> with something like <python.version>3.6</python.version>? Or is the approach different for Python?
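To make it concrete, this is the kind of pom fragment I had in mind; it's purely a guess on my part, and I don't know whether standard Maven plugins would recognize a python.version property at all:

```xml
<!-- Hypothetical sketch: the Scala-specific properties from my current pom
     swapped out for Python ones. I am not sure any Maven plugin actually
     consumes these properties for a PySpark build. -->
<properties>
    <python.version>3.6</python.version>
    <spark.version>2.4.0.cloudera2</spark.version>
    <encoding>UTF-8</encoding>
</properties>
```

Is this roughly the right direction, or does a PySpark project normally use a different build tool entirely?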