I am working with Apache Spark through Maven, and I am trying to modify the source by including a 3rd party jar and utilizing some of its methods. I get the following error when compiling the Spark project with
mvn -Dhadoop.version=2.2.0 -Dscala-2.11 -DskipTests clean package
[ERROR] not found: object edu
[ERROR] import edu.xxx.cs.aggr._
I modified ResultTask.scala to contain an import statement for this package, so apparently Maven is unable to find the jar I am trying to use and link it with the project.
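Concretely, the only change to the Spark source so far is the import at the top of ResultTask.scala; a minimal sketch of the intent is below (the Aggr object and its process method are placeholders for whatever the jar actually exposes, not real members of it):

package org.apache.spark.scheduler

// The modification that triggers the compile error:
import edu.xxx.cs.aggr._

// The eventual goal is to call into the jar from ResultTask, e.g.
// something like (placeholder names):
//   Aggr.process(...)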
I have added the following dependency to the pom.xml file:
<dependency>
    <groupId>edu.xxx.cs</groupId>
    <artifactId>aggr</artifactId>
    <version>0.99</version>
    <scope>system</scope>
    <systemPath>${basedir}/aggr.jar</systemPath>
</dependency>
The jar file I am trying to link is located in the same directory as the Spark pom.xml file, and I inserted the dependency between two existing dependencies in that file (see the sketch below). I'm not sure what's wrong, but for now I would just like the jar to be linked so that I can use the methods within it. I'm also not sure whether anything specific is required for the groupId, artifactId, and version; edu.xxx.cs.aggr is the root package, which contains other source files and packages. I would appreciate any help.
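For context, the placement looks roughly like this (the neighboring entries are placeholders, not the actual dependencies from Spark's pom.xml):

<dependencies>
    <!-- ... existing Spark dependencies (placeholder) ... -->
    <dependency>
        <groupId>edu.xxx.cs</groupId>
        <artifactId>aggr</artifactId>
        <version>0.99</version>
        <scope>system</scope>
        <systemPath>${basedir}/aggr.jar</systemPath>
    </dependency>
    <!-- ... existing Spark dependencies (placeholder) ... -->
</dependencies>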
UPDATE
I used
mvn install:install-file -Dfile=<path-to-file> -DgroupId=edu.xxx.cs -DartifactId=aggr -Dversion=0.99 -Dpackaging=jar

to install the jar to the .m2 repo. I checked the repo to see if it was installed, and it was.
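To be precise, "checked the repo" means I looked for the artifact under the standard local-repository layout for those coordinates, roughly:

ls ~/.m2/repository/edu/xxx/cs/aggr/0.99/
# should list aggr-0.99.jar and the pom generated by install-file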
I changed the dependency in pom.xml to
<dependency>
    <groupId>edu.xxx.cs</groupId>
    <artifactId>aggr</artifactId>
    <version>0.99</version>
</dependency>
I still get the same error.