
I work a lot with Avro files, and every time I start the Spark shell I have to do these 5 imports:

import org.apache.avro.generic.GenericRecord
import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.AvroKeyInputFormat
import org.apache.hadoop.io.NullWritable
import org.apache.spark.SparkContext

Is it possible to have these imported automatically every time I start the spark-shell?

Knows Not Much

1 Answer


I think this has been answered here. Just put your imports in a file (some_file) and either specify it when starting spark-shell:

spark-shell -i some_file

or run

scala> :load some_file

after spark-shell starts.
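
For example (some_file is just a placeholder name, use whatever path you like), the file would simply contain the imports from your question, one per line:

// some_file: imports to preload into the Spark shell
import org.apache.avro.generic.GenericRecord
import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.AvroKeyInputFormat
import org.apache.hadoop.io.NullWritable
import org.apache.spark.SparkContext

Then either launch with spark-shell -i some_file or run :load some_file from the prompt; both execute the file's lines as if you had typed them into the shell.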

Greg