
I am trying to set Hive runtime parameters while submitting a job, but it is not working at all. The same statements work fine when I run them in the Hive console, but not from spark-shell.

The job runs without any error, but nothing gets inserted into the table.

val sqlAgg =
  s"""
     |set tez.task.resource.memory.mb=5000;
     |SET hive.tez.container.size=6656;
     |SET hive.tez.java.opts=-Xmx5120m;
     |set hive.optimize.ppd=true;
     |set hive.execution.engine=tez; 
     |INSERT INTO table Partition(JobID=$jobId)
     |SELECT
     |UUID() AS Key,
     |a,
     |b,
     |SUM(dc_1) AS dc
     |FROM tablenametask where jobid=$jobId
     |GROUP BY
     |a,
     |b,
     |c
     """.stripMargin
     
hive.executeUpdate(sqlAgg)
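
A likely cause worth checking: executeUpdate in the Hive Warehouse Connector submits a single statement, so a string that bundles several semicolon-terminated SET commands together with the INSERT may silently do nothing. Below is a minimal sketch that issues each statement separately; it assumes the Hortonworks HWC HiveWarehouseSession API, that the session keeps one Hive connection (so session-level SETs persist), and uses target_table as a placeholder for the destination table, which is not named above.

import com.hortonworks.hwc.HiveWarehouseSession

// `spark` is the active SparkSession in spark-shell; build (or reuse) the HWC session.
val hive = HiveWarehouseSession.session(spark).build()

// Issue each SET as its own statement instead of bundling them with the INSERT.
Seq(
  "set tez.task.resource.memory.mb=5000",
  "set hive.tez.container.size=6656",
  "set hive.tez.java.opts=-Xmx5120m",
  "set hive.optimize.ppd=true",
  "set hive.execution.engine=tez"
).foreach(stmt => hive.executeUpdate(stmt))

// Send the INSERT on its own, without leading SET commands or trailing semicolons.
hive.executeUpdate(
  s"""INSERT INTO target_table PARTITION (JobID=$jobId)
     |SELECT UUID() AS Key, a, b, SUM(dc_1) AS dc
     |FROM tablenametask WHERE jobid=$jobId
     |GROUP BY a, b, c""".stripMargin)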
dataeng
  • Check this link: https://stackoverflow.com/questions/32586793/howto-add-hive-properties-at-runtime-in-spark-shell – HArdRe537 Oct 13 '22 at 07:05

1 Answer


Try using spark.sql! It should work.
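
In spark-shell that would look roughly like the sketch below. Note that spark.sql applies SET values to Spark's own SQL session and runs the query on Spark's engine, so Tez-specific properties have no effect there; target_table again stands in for the unnamed destination table.

// Properties are set one statement per call; Tez-specific settings are
// ignored here because spark.sql runs on Spark's engine, not Tez.
spark.sql("SET hive.optimize.ppd=true")

// The INSERT itself goes through as a single statement with no embedded SETs.
spark.sql(
  s"""INSERT INTO target_table PARTITION (JobID=$jobId)
     |SELECT UUID() AS Key, a, b, SUM(dc_1) AS dc
     |FROM tablenametask WHERE jobid=$jobId
     |GROUP BY a, b, c""".stripMargin)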

  • You mean, in this way? spark.sql(sqlAgg) – dataeng Oct 10 '22 at 05:51
  • Not working!! Reason: spark.sql executes the statement in Spark's in-memory engine. On the other hand, hive.executeUpdate runs the statement directly in Hive through the Hive context. So if I want to use Hive properties (SET ...), they have to take effect on the Hive side. – dataeng Oct 10 '22 at 06:06
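
If the properties really do have to take effect on the Hive side, one option is to put them into the hive-conf part (after the '?') of the HiveServer2 JDBC URL that the Hive Warehouse Connector uses, so every statement it submits runs with those settings. This is only a sketch: it assumes HWC reads spark.sql.hive.hiveserver2.jdbc.url when the session is built, and hs2-host:10000/default is a placeholder address.

// Hypothetical HiveServer2 address; the key=value pairs after '?' are Hive
// configuration properties applied to the JDBC session.
spark.conf.set(
  "spark.sql.hive.hiveserver2.jdbc.url",
  "jdbc:hive2://hs2-host:10000/default" +
    "?tez.task.resource.memory.mb=5000;hive.tez.container.size=6656;" +
    "hive.tez.java.opts=-Xmx5120m;hive.optimize.ppd=true;hive.execution.engine=tez")

// Build the HWC session after the URL is set so its statements pick up the settings.
import com.hortonworks.hwc.HiveWarehouseSession
val hive = HiveWarehouseSession.session(spark).build()
hive.executeUpdate(sqlAgg)  // sqlAgg should now contain only the INSERT, no SET lines

In practice this URL is usually supplied with --conf when launching spark-shell; setting it from inside the shell only helps if the HWC session has not been built yet.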