
I saw an email indicating the sunset of support for Apache Spark 1.6 within IBM Cloud. I am pretty sure my version is 2.x, but I wanted to confirm. I couldn't find anywhere in the UI that indicated the version, and the bx CLI command that I thought would show it didn't.

[chrisr@oc5287453221 ~]$ bx service show "Apache Spark-bc"
Invoking 'cf service Apache Spark-bc'...


Service instance: Apache Spark-bc
Service: spark
Bound apps: 
Tags: 
Plan: ibm.SparkService.PayGoPersonal
Description: IBM Analytics for Apache Spark for IBM Cloud.
Documentation url: https://www.ng.bluemix.net/docs/services/AnalyticsforApacheSpark/index.html
Dashboard: https://spark-dashboard.ng.bluemix.net/dashboard

Last Operation
Status: create succeeded
Message: 
Started: 2018-01-22T16:08:46Z
Updated: 2018-01-22T16:08:46Z

How do I determine the version of Spark that I am using? Also, when I tried going to the "Dashboard" URL from above, I got an "Internal Server Error" message after logging in.

The information found on How to check the Spark version doesn't seem to help, because it relates to locally installed Spark instances. I need to find out this information from the IBM Cloud (i.e. Bluemix) side, using either the UI or the Bluemix CLI. Another possibility would be running a command from a Jupyter Notebook in IPython, running in Data Science Experience (part of IBM Cloud).

Chris Ratcliffe
  • I updated my question to clarify that the instance of Spark is running in the IBM Cloud, and I am looking to find out the version via the IBM Cloud (i.e. Bluemix) UI or CLI, or from DSX. – Chris Ratcliffe Jan 31 '18 at 15:45
  • The Spark service itself is not version specific. To find out whether or not you need to migrate you need to inspect the apps/tools that utilize the service. For example if you've created notebooks in DSX you associated them with a kernel that was bound to a specific Spark version and you'd need to open each notebook to find out which Spark version they are utilizing. – ptitzler Jan 31 '18 at 16:32
  • I see, thanks. All of my notebooks are bound to Spark 2.0. – Chris Ratcliffe Jan 31 '18 at 16:36
  • Can you answer your own question then @ChrisRatcliffe instead of leaving it open. – Dr G. Feb 28 '18 at 00:29

1 Answer


The answer was given by ptitzler in the comments above; I am just posting it as an answer, as requested by the email I was sent.

The Spark service itself is not version specific. To find out whether or not you need to migrate you need to inspect the apps/tools that utilize the service. For example if you've created notebooks in DSX you associated them with a kernel that was bound to a specific Spark version and you'd need to open each notebook to find out which Spark version they are utilizing. – ptitzler Jan 31 at 16:32
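Since the version binding is per notebook, you can also confirm it from inside a running DSX notebook: the predefined SparkContext exposes `sc.version` (and on Spark 2.x, `spark.version` on the SparkSession) as the version string. A minimal sketch of checking whether a given version string falls under the 1.6 sunset — the `needs_migration` helper below is hypothetical, for illustration only:

```python
# In a DSX notebook the SparkContext is predefined as `sc`, and
# sc.version returns the version string, e.g. "2.1.0".
# Hypothetical helper: flag versions at or below Spark 1.6.
def needs_migration(version_string):
    # Compare only the (major, minor) components of the version.
    major, minor = (int(p) for p in version_string.split(".")[:2])
    return (major, minor) <= (1, 6)

# In the notebook you would call: needs_migration(sc.version)
print(needs_migration("1.6.0"))  # True  -> affected by the sunset
print(needs_migration("2.0.2"))  # False -> on Spark 2.x, not affected
```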
