
PROBLEM: I have two Java classes with identical fully qualified names.

I am running an EMR job for which I package all my dependency jars into a single jar and upload it to S3. The EMR cluster is supposed to consume this jar from S3, but I am getting the following error:

    Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hadoop.fs.s3native.AbstractNativeS3FileSystemStore cannot access its superinterface org.apache.hadoop.fs.s3native.NativeFileSystemStore
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:861)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:906)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1411)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1435)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:260)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:352)
        at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:110)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1016)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1033)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:174)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:904)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1140)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:904)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:501)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:531)
        at com.amazon.idq.chia.aws.emr.mains.BaseFilterStepMain.configureAndRunJob(BaseFilterStepMain.java:51)
        at com.amazon.idq.chia.aws.emr.mains.BaseFilterStepMain.run(BaseFilterStepMain.java:84)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at com.amazon.idq.chia.aws.emr.mains.FFilterHybridStepMain.main(FFilterHybridStepMain.java:24)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:187)
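For context, this flavor of IllegalAccessError is thrown while the JVM is defining a class whose superinterface it is not allowed to see: a package-private (default access) interface is only visible from the same runtime package, which means the same package name loaded by the same class loader. A minimal illustration of the relationship involved (hypothetical stand-ins, not the actual Hadoop sources):

    // File: org/apache/hadoop/fs/s3native/NativeFileSystemStore.java
    // Hypothetical stand-ins for the two types named in the stack trace.
    package org.apache.hadoop.fs.s3native;

    interface NativeFileSystemStore {   // package-private (default access)
        void initialize();
    }

    // Kept in the same source file for brevity. This compiles, because both
    // types share a compile-time package. At runtime, however, if the
    // interface is loaded from one jar by one class loader and the
    // implementing class from another, they land in different *runtime*
    // packages, and defining the class fails with exactly this
    // IllegalAccessError.
    abstract class AbstractNativeS3FileSystemStore implements NativeFileSystemStore {
    }

This is why merging everything into one jar only helps if the running job really loads both types from that jar.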

WHAT I HAVE TRIED TILL NOW: I noted that the visibility of the two versions is different. I suspected the class loader was picking up the NativeFileSystemStore class (default access) from jar1 when the code expected the jar2 version of NativeFileSystemStore (public access). So I modified the build script to:

1. Unzip jar1 and jar2.
2. Remove the restrictive NativeFileSystemStore class from jar1.
3. Move org.apache.hadoop.fs.s3native.* from jar2 to jar1.
4. Repackage the classes as jar1-resolved.jar and jar2-resolved.jar.
5. Run the EMR job again.

RESULT: I am still getting the same error.
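To verify whether the repackaging actually takes effect at runtime, one option is to print where the JVM loads each of the two types from, using the same classpath as the failing step. A minimal diagnostic sketch (WhichJar is a hypothetical helper, not part of the job):

    // Hypothetical diagnostic helper: prints the jar each class is loaded from.
    // Loading AbstractNativeS3FileSystemStore may itself throw the
    // IllegalAccessError; it is caught and reported instead of crashing.
    public class WhichJar {
        public static void main(String[] args) {
            String[] names = {
                "org.apache.hadoop.fs.s3native.NativeFileSystemStore",
                "org.apache.hadoop.fs.s3native.AbstractNativeS3FileSystemStore"
            };
            for (String name : names) {
                try {
                    Class<?> c = Class.forName(name);
                    java.security.CodeSource src =
                            c.getProtectionDomain().getCodeSource();
                    System.out.println(name + " -> "
                            + (src == null ? "bootstrap/unknown" : src.getLocation()));
                } catch (Throwable t) {
                    System.out.println(name + " -> failed to load: " + t);
                }
            }
        }
    }

If NativeFileSystemStore still resolves to the original jar (or to the cluster's own Hadoop installation) rather than to jar1-resolved.jar, the repackaging never took effect for the running job.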

Comment: This can help: https://stackoverflow.com/questions/62880009/error-through-remote-spark-job-java-lang-illegalaccesserror-class-org-apache-h – CodeRunner May 25 '21 at 15:02

1 Answer


In most cases, the reason for an IllegalAccessError is a version mismatch. Can you please run "javap -verbose <classname> | grep major" on both org.apache.hadoop.fs.s3native.NativeFileSystemStore (the one you have included in your resolved jar) and org.apache.hadoop.fs.s3native.AbstractNativeS3FileSystemStore (which I believe you have coded) and check whether their major versions match?
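If running javap against the extracted jars is inconvenient, the same information can be read straight from the first bytes of each .class file; a minimal sketch (ClassVersion is a hypothetical helper, pass it a .class file extracted from each jar):

    import java.io.DataInputStream;
    import java.io.FileInputStream;

    // Hypothetical helper: prints the class file format version of a .class file.
    // Usage: java ClassVersion path/to/Foo.class
    // Major 50 = Java 6, 51 = Java 7, 52 = Java 8.
    public class ClassVersion {
        public static void main(String[] args) throws Exception {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                if (in.readInt() != 0xCAFEBABE) {   // class file magic number
                    System.err.println(args[0] + " is not a class file");
                    return;
                }
                int minor = in.readUnsignedShort();
                int major = in.readUnsignedShort();
                System.out.println(args[0] + ": major=" + major + " minor=" + minor);
            }
        }
    }

If the two classes report different major versions, they were compiled for different Java releases, which supports the version-mismatch theory.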
