
Is there a way to consume partitions by their partition index?

I am able to find the total number of Spark partitions using the following API:

rdd.partitions.size

Currently, the partitions are consumed using rdd.foreachPartition, but is there a way to consume a single partition by its index?

SaiVikas

1 Answer


Check out the Spark RDD mapPartitionsWithIndex transformation. This answer should also be useful.
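A minimal sketch of consuming a single partition by index, assuming a live SparkContext `sc`; the RDD contents and the target index are illustrative:

```scala
// Build a sample RDD with 4 partitions (illustrative data).
val rdd = sc.parallelize(1 to 100, 4)

val targetIndex = 2  // the partition we want to consume

// mapPartitionsWithIndex passes each partition's index alongside its
// iterator; emit elements only for the matching partition.
val onlyOnePartition = rdd.mapPartitionsWithIndex { (idx, iter) =>
  if (idx == targetIndex) iter else Iterator.empty
}

onlyOnePartition.foreach(println)
```

If you only need to run a job on selected partitions (without materializing a new RDD), SparkContext.runJob also accepts an explicit sequence of partition indices, e.g. `sc.runJob(rdd, (iter: Iterator[Int]) => iter.toArray, Seq(targetIndex))`.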

Apurba Pandey