
I am trying to build an Apache Spark application in Java in Eclipse, using Gradle as my build management system. What should I write in my build.gradle, and which Gradle commands should I use, to do the same thing that this Maven POM and these terminal commands do?

<project>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.1</version>
    </dependency>
  </dependencies>
</project>

and

mvn package

$ YOUR_SPARK_HOME/bin/spark-submit \
--class "SimpleApp" \
--master local[4] \
target/simple-project-1.0.jar
khateeb

1 Answer

Try this. If it is a Java project, replace the first line with `apply plugin: 'java'`:

apply plugin: 'scala'

buildscript {
    repositories {
        mavenCentral()
    }
}

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    compile "org.apache.spark:spark-core_2.10:1.6.1"
}
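
For the command-line side of the question, here is roughly what the Java variant of the build file and the equivalent commands look like. This is a sketch: the `jar` base name and the `SimpleApp` class name are taken from the Maven setup in the question, and the output path is Gradle's default, not something the answer above specifies.

```
apply plugin: 'java'

group = 'edu.berkeley'
version = '1.0'

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    compile "org.apache.spark:spark-core_2.10:1.6.1"
}

jar {
    baseName = 'simple-project'
}
```

The Gradle equivalent of `mvn package` is `gradle jar` (or `gradle build`, which also compiles and runs tests). Gradle writes the jar to `build/libs` rather than Maven's `target`, so the spark-submit invocation becomes:

```
gradle build

$ YOUR_SPARK_HOME/bin/spark-submit \
--class "SimpleApp" \
--master local[4] \
build/libs/simple-project-1.0.jar
```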
Daniel Zolnai
  • This does result in a successful build, but in Eclipse the error lines are still appearing for all the imports. The jar is also not created. Is it possible to do `spark-submit` from Gradle? – khateeb Jul 19 '16 at 11:48
  • To solve that, follow these steps: https://stackoverflow.com/questions/17907927/update-my-gradle-dependencies-in-eclipse — in my case, the second answer solved my problem (sorry about my English) – Genarito Jul 23 '17 at 19:52
  • I don't know if submitting the Spark job and building the jar both in Gradle is the best idea, @khateeb. I would use Jenkins, or even better Airflow – Brian Apr 25 '21 at 20:50