64

I want to check the Spark version in CDH 5.7.0. I have searched on the internet but am not able to figure it out. Please help.

– Ironman
  • See [When is it justifiable to downvote a question?](http://meta.stackoverflow.com/questions/252677/when-is-it-justifiable-to-downvote-a-question) on [meta]. – tripleee Jul 27 '16 at 07:27
  • But I tried a few things before posting it; I didn't find any help, that's why I posted here. – Ironman Jul 27 '16 at 07:28
  • Then show us your efforts. – tripleee Jul 27 '16 at 07:28
  • Programmatically finding the Spark version would be best, since that's for sure the one my code is using. How to do it in Python? Thanks – Geoffrey Anderson May 24 '18 at 16:25
  • `import pyspark` followed by `print(pyspark.__version__)` works; `pyspark.version()` and `pyspark.__version()__` do not. – Geoffrey Anderson May 24 '18 at 16:28
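
Tidying the working line from that last comment into a runnable snippet: it prints the version of whatever PySpark package the interpreter imports (the `__version__` attribute may be absent in very old PySpark builds, so treat its availability as an assumption):

```python
# Print the version of the PySpark package installed in this Python environment.
# Note: this is the client-side package version, which is not guaranteed to match
# the Spark version running on the CDH cluster itself.
import pyspark

print(pyspark.__version__)
```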

3 Answers

125

In addition to @Binary Nerd's answer:

If you are using Spark, use the following to get the Spark version:

spark-submit --version

or

log in to Cloudera Manager, go to the Hosts page, and run Inspect Hosts in Cluster.
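
If you would rather capture that from a script than read the banner by eye, one option (not from the answer itself, just a sketch) is to shell out to `spark-submit --version` and search its output for the version string; the banner wording and the stream it lands on vary between releases, so this is best-effort:

```python
# Sketch (Python 3): run `spark-submit --version` and pull "version X.Y.Z" out of
# the banner it prints. Assumes spark-submit is on the PATH; some releases print
# the banner to stderr, so both streams are merged before matching.
import re
import subprocess

def spark_submit_version():
    result = subprocess.run(
        ["spark-submit", "--version"],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,
    )
    match = re.search(r"version\s+(\d+\.\d+(?:\.\d+)?)", result.stdout)
    return match.group(1) if match else None

if __name__ == "__main__":
    print(spark_submit_version())
```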

33

You can get the Spark version by using one of the following commands:

spark-submit --version

spark-shell --version

spark-sql --version
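
The commands above report what the client machine has installed; you can also ask a running context directly from PySpark. A minimal sketch, assuming PySpark is importable in the local Python environment:

```python
# Ask a live SparkContext which Spark version it is running against.
from pyspark import SparkContext

# "local[*]" and the app name are just illustrative; under spark-submit the master
# is normally supplied by the cluster configuration instead.
sc = SparkContext(master="local[*]", appName="version-check")
print(sc.version)  # e.g. 1.6.x on CDH 5.7.0, per the release notes linked below
sc.stop()
```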

You can visit the site below to find out which Spark version ships with CDH 5.7.0:

http://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_rn_new_in_cdh_57.html#concept_m3k_rxh_1v

2

According to the Cloudera documentation ([What's New in CDH 5.7.0](http://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_rn_new_in_cdh_57.html#concept_m3k_rxh_1v)), CDH 5.7.0 includes Spark 1.6.0.

– Binary Nerd