
Find the size of data stored in an RDD loaded from a text file in Apache Spark

I get an error that says "No value found for SparkDriver" when I add the listener code (shown in the comments below) to a simple Spark program. How can I find the size of the data stored in an RDD that was created from a text file?

  • It's not clear what you are asking. – Leandro Mar 08 '16 at 13:33
  • SparkDriver.getContext.addSparkListener(new SparkListener() {
      override def onStageCompleted(stageCompleted: SparkListenerStageCompleted) {
        val map = stageCompleted.stageInfo.rddInfos
        map.foreach(row => {
          println("rdd memSize " + row.memSize)
          println("rdd diskSize " + row.diskSize)
        })
      }
    })
    – Hema Nagaiah N Mar 10 '16 at 04:18
  • I get an error when I try the above code to get the memSize of the RDD in Spark. – Hema Nagaiah N Mar 10 '16 at 04:19
  • Please attach the error you got when you execute this code. – Leandro Mar 10 '16 at 12:50

0 Answers
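
The snippet in the comments calls SparkDriver.getContext, but Spark's public API does not define a value named SparkDriver, which is the most likely cause of the "No value found for SparkDriver" compile error. Below is a minimal Scala sketch of the same listener approach registered directly on a SparkContext; the object name, master setting, and input path are placeholders, and the reported sizes are only non-zero for RDDs that are actually persisted.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}
    import org.apache.spark.storage.StorageLevel

    object RddSizeExample {  // hypothetical object name, used only for this sketch
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("RddSizeExample").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // Register the listener on the SparkContext itself; Spark has no "SparkDriver" object.
        sc.addSparkListener(new SparkListener() {
          override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
            stageCompleted.stageInfo.rddInfos.foreach { info =>
              println("rdd " + info.name + " memSize " + info.memSize + " diskSize " + info.diskSize)
            }
          }
        })

        // memSize/diskSize are only tracked for persisted RDDs, so cache the RDD
        // and run an action to trigger a stage (and the listener callback).
        val rdd = sc.textFile("input.txt").persist(StorageLevel.MEMORY_AND_DISK)  // placeholder path
        println("line count: " + rdd.count())

        sc.stop()
      }
    }

With the RDD cached, each completed stage prints the memory and disk footprint that Spark's storage layer reports for it; without the persist call both numbers would stay at zero.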