
I have logged in to the ma1-dsrqt-lapp101.corp.apple.com server. On that server, the file is located at /ngs/app/idmaerqt/reports/BD_REPORTS_RK/missingRecs.txt (note: this file is not in an HDFS location).

I am trying to read this file with the code below using Spark 1.6.2, but I am not able to read it.

val inputPath = "/ngs/app/idmaerqt/reports/BD_REPORTS_RK/missingRecs.txt"
// inputPath is already absolute, so prefix with "file://" (not "file:///")
// to avoid a doubled slash; also note the missing `val` in the original.
val inputRecords = sc.textFile("file://" + inputPath)

Please let me know what I am doing wrong; any help is appreciated.

Thanks for your help.

  • [How to load local file in sc.textFile, instead of HDFS](https://stackoverflow.com/q/27299923/10465355), in particular [Aklank Jain's answer](https://stackoverflow.com/a/47631339/10465355) - _While Spark supports loading files from the local filesystem, it requires that the files are available at the same path on all nodes in your cluster_ – 10465355 Jan 17 '19 at 13:16
  • I am using the same approach but still getting the same issue; can you please elaborate? Thanks – Krishna Jan 17 '19 at 13:21
  • @Krishna if the file is small enough, you may read it with normal Scala code and turn it into a local **Seq**, which you can then convert to an **RDD** / **DataFrame** / **Dataset**: `Source.fromFile(path).getLines().toDF`. However, this is only for a development / testing perspective; in production always prefer **HDFS**, **S3**, or another kind of distributed filesystem. – Luis Miguel Mejía Suárez Jan 17 '19 at 13:53
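Following up on the last comment, a minimal sketch of that approach: read the driver-local file with plain Scala I/O, then distribute the lines. The `readLocal` helper is hypothetical (not from the original post), and this assumes the file fits in driver memory and that a `SparkContext` named `sc` is in scope, as in spark-shell:

```scala
import scala.io.Source

// Hypothetical helper: read a driver-local file into memory.
// Assumes the file is small enough to hold on the driver.
def readLocal(path: String): Seq[String] = {
  val source = Source.fromFile(path)
  try source.getLines().toList
  finally source.close()
}

// With a SparkContext `sc` in scope (e.g. in spark-shell on 1.6.2),
// the lines can then be distributed as an RDD:
// val inputRecords = sc.parallelize(readLocal(inputPath))
```

Because the file is read on the driver, this sidesteps the requirement that it exist at the same path on every worker node, but as the comment notes, it is only suited to development/testing; in production prefer a distributed filesystem such as HDFS or S3.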

0 Answers