We're trying to upgrade our system to use Java 8 instead of Java 7.

In one of the process configurations in our Spring Hadoop XML, we define a pre-action that is actually a JavaScript script.

With Java 7 it works fine, but after switching to Java 8 an exception is thrown when the process runs (see the stack traces below).

I saw that Java 8 uses a different JavaScript engine than Java 6 and 7 (Nashorn instead of Rhino). The engine itself seems to work: I tried the jjs utility (http://www.oracle.com/technetwork/articles/java/jf14-nashorn-2126515.html) and it works great.
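
For reference, here is a quick diagnostic sketch (not part of our configuration, just something to check the environment) that lists the JSR-223 engines a JVM can actually discover:

    import javax.script.ScriptEngineFactory;
    import javax.script.ScriptEngineManager;

    // Diagnostic sketch: print every JSR-223 script engine visible to this JVM.
    // On Java 8, "Oracle Nashorn" should appear with the "js" extension;
    // if it doesn't, the engine jar is not on the application's classpath.
    public class ListScriptEngines {
        public static void main(String[] args) {
            for (ScriptEngineFactory factory : new ScriptEngineManager().getEngineFactories()) {
                System.out.println(factory.getEngineName()
                        + " -> names=" + factory.getNames()
                        + ", extensions=" + factory.getExtensions());
            }
        }
    }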

Here's my xml (relevant parts):

    <hdp:script id="prepare-hdfs" location="scripts/prepare-windows-hdfs.js"/>

    <bean id="runner" class="org.springframework.data.hadoop.mapreduce.JobRunner">
        <property name="runAtStartup" value="true" />
        <property name="waitForCompletion" value="false"/>
        <property name="killJobAtShutdown" value="false"/>
        <property name="preAction">
            <list>
                <ref bean="prepare-hdfs" />
            </list>
        </property>
        <property name="jobs">
            <list>
                <ref bean="my-job" />
            </list>
        </property>
    </bean>

And here's the content of the scripts/prepare-windows-hdfs.js file:

    if (java.lang.System.getProperty("os.name").startsWith("Windows")) {
        // 0655 = rw-r-xr-x
        org.apache.hadoop.mapreduce.JobSubmissionFiles.JOB_DIR_PERMISSION.fromShort(0655)
        org.apache.hadoop.mapreduce.JobSubmissionFiles.JOB_FILE_PERMISSION.fromShort(0655)
    }

I tried adding the following line to the beginning of the script:

    try {load("nashorn:mozilla_compat.js");} catch (e) {} // for Java 8

as suggested in http://docs.spring.io/autorepo/docs/spring-hadoop/2.2.1.RELEASE/reference/html/springandhadoop-fs.html and in "Switching from Rhino to Nashorn", but it didn't help.

I also tried using an inline script, i.e.:

    <hdp:script id="prepare-hdfs" language="javascript">
        // 'hack' default permissions to make Hadoop work on Windows
        try {load("nashorn:mozilla_compat.js");} catch (e) {} // for Java 8

        if (java.lang.System.getProperty("os.name").startsWith("Windows")) {
            // 0655 = rw-r-xr-x
            org.apache.hadoop.mapreduce.JobSubmissionFiles.JOB_DIR_PERMISSION.fromShort(0655)
            org.apache.hadoop.mapreduce.JobSubmissionFiles.JOB_FILE_PERMISSION.fromShort(0655)
        }
    </hdp:script>

That didn't help either.

Here's the exception that is thrown:

    Caused by: java.lang.IllegalArgumentException: No suitable engine found for extension js
        at org.springframework.util.Assert.notNull(Assert.java:112)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptEvaluator.discoverEngine(Jsr223ScriptEvaluator.java:101)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptEvaluator.evaluate(Jsr223ScriptEvaluator.java:74)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptRunner.call(Jsr223ScriptRunner.java:75)
        at org.springframework.data.hadoop.scripting.HdfsScriptRunner.call(HdfsScriptRunner.java:68)
        at org.springframework.data.hadoop.mapreduce.JobRunner.invoke(JobRunner.java:88)
        at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:51)
        at org.springframework.data.hadoop.mapreduce.JobRunner.afterPropertiesSet(JobRunner.java:44)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1477)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1417)
        ... 20 more

or:

    Caused by: java.lang.IllegalArgumentException: No suitable engine found for language javascript
        at org.springframework.util.Assert.notNull(Assert.java:112)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptEvaluator.discoverEngine(Jsr223ScriptEvaluator.java:101)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptEvaluator.evaluate(Jsr223ScriptEvaluator.java:74)
        at org.springframework.data.hadoop.scripting.Jsr223ScriptRunner.call(Jsr223ScriptRunner.java:75)
        at org.springframework.data.hadoop.scripting.HdfsScriptRunner.call(HdfsScriptRunner.java:68)
        at org.springframework.data.hadoop.mapreduce.JobRunner.invoke(JobRunner.java:88)
        at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:51)
        at org.springframework.data.hadoop.mapreduce.JobRunner.afterPropertiesSet(JobRunner.java:44)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1477)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1417)
        ... 20 more

The exact JDKs I use are: jdk1.7.0_79.x86_64 (works) and jdk1.8.0_92.x86_64 (doesn't work).

1 Answer

Apparently, the application was clearing the classpath and adding only rt.jar back on its own. In Java 8, however, the Nashorn script engine ships as nashorn.jar in the JRE's ext directory (jre/lib/ext) rather than inside rt.jar, so the JavaScript engine was no longer visible to the application. Adding that path to the classpath fixed the problem.
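
For anyone who wants to check the same thing, here's a minimal sketch (assuming the standard JDK 8 layout, where Nashorn ships under the JRE's extension directory) that verifies whether nashorn.jar is present in the extension directories:

    import java.io.File;

    // Sketch assuming a standard JDK 8 layout: Nashorn ships as
    // <java.home>/lib/ext/nashorn.jar rather than inside rt.jar.
    public class FindNashorn {
        public static void main(String[] args) {
            // java.ext.dirs lists the extension directories scanned by the JVM.
            String extDirs = System.getProperty("java.ext.dirs", "");
            for (String dir : extDirs.split(File.pathSeparator)) {
                File jar = new File(dir, "nashorn.jar");
                System.out.println(jar + " exists: " + jar.exists());
            }
        }
    }

If the jar is there but the application rebuilds its classpath without it, adding nashorn.jar (or the whole ext directory) back to that classpath should make the js engine discoverable again.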