When attempting to start any notebook on IBM DSX with the Scala 2.11/Spark 2.0 kernel, I get the following error:

Dead kernel: The kernel has died, and the automatic restart has failed. It is possible the kernel cannot be restarted. If you are not able to restart the kernel, you will still be able to save the notebook, but running code will no longer work until the notebook is reopened.

I've ensured that all kernels in my other notebooks are stopped, and I've tried changing the Spark version. I'm able to create and start Python/Spark notebooks.

Pål

1 Answer

This can happen if you put a JAR file that conflicts with the Scala environment into ~/data/libs/. There are version-specific subdirectories, too. For more information, see the DSX documentation: https://datascience.ibm.com/docs/content/analyze-data/importing-libraries.html

From a Python notebook, execute the following to check the contents of your libs directories:

!ls -ARF ~/data/libs/

If you find anything suspicious there, you can also delete files from the Python notebook. For example:

!rm -f ~/data/libs/*.jar
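
If the conflicting JAR ended up in one of the version-specific subdirectories instead, the same pattern applies there. The subdirectory name below is only a hypothetical example; substitute whatever names the listing above actually shows:

!rm -f ~/data/libs/scala-2.11/*.jar  # "scala-2.11" is a placeholder, not a confirmed name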

Then restart the Scala kernel to see if that made a difference.


Sometimes the kernel log files contain additional information for tracking down the problem. List the Scala kernel log files, again from the Python notebook, using:

!ls $SERVICE_HOME/kernel-scala-*.log

Then get the contents of a log file using:

!cat $SERVICE_HOME/kernel-scala-<timestamp>.log
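
To scan all of the Scala kernel logs for obvious failures at once, a quick grep works too; the pattern here is just a rough starting point, not an exhaustive filter:

!grep -iE 'error|exception' $SERVICE_HOME/kernel-scala-*.log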
Roland Weber