
I want to insert a record into a table using Spark SQLContext.

Here is the sample code I used to retrieve the data:

        Class.forName(dbDriver);

        Map<String, String> options = new HashMap<String, String>();
        options.put("url", dbUrl);
        options.put("dbtable", dbTable);
        options.put("driver", dbDriver);

        SparkConf conf = new SparkConf().setAppName("JAVA_SPARK")
                .setMaster("local[2]").set("spark.ui.port", "7077");

        JavaSparkContext jsc = new JavaSparkContext(conf);

        SQLContext sqlContext = new SQLContext(jsc);

        DataFrame dframe = sqlContext.read().format("jdbc")
                .options(options).load();

        dframe.show();

How do I insert a new record into the table? The dframe.show() operation works fine for me.

M.Prabhu

1 Answer


I suppose Spark SQL does not provide such functionality as of now. You can write your own insert method and call it inside an action such as foreach to handle this scenario.

For more information and best practices on implementing this, please refer here: Inserting Analytic data from Spark to Postgres
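Following that approach, here is a minimal sketch of a plain-JDBC insert helper that could be invoked from inside an action such as foreachPartition. The class name, table, and column names are placeholders, and dbUrl is assumed to be the same JDBC URL used for the read; this is an illustrative sketch, not the answer's exact implementation.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class JdbcInsertExample {

    // Builds a parameterized INSERT statement for the given table and columns.
    static String buildInsertSql(String table, String[] cols) {
        StringBuilder sb = new StringBuilder("INSERT INTO " + table + " (");
        sb.append(String.join(", ", cols));
        sb.append(") VALUES (");
        for (int i = 0; i < cols.length; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        sb.append(")");
        return sb.toString();
    }

    // Executes one insert; intended to be called from inside a Spark action
    // (e.g. foreachPartition) so the connection is opened on the executor,
    // since JDBC connections are not serializable.
    static void insertRow(String dbUrl, String sql, Object[] values) throws Exception {
        try (Connection conn = DriverManager.getConnection(dbUrl);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            for (int i = 0; i < values.length; i++) {
                ps.setObject(i + 1, values[i]);
            }
            ps.executeUpdate();
        }
    }
}
```

In practice you would open one connection per partition rather than per row, e.g. via dframe.toJavaRDD().foreachPartition(...), to avoid the overhead of reconnecting for every record.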

Shivansh