
I have a Mac, and I have several Spark SQL queries that I need to run against Hive data on another machine. I know I need the core-site.xml, hdfs-site.xml, and hive-site.xml files in order to access the Hive tables, but will I need to install Apache Hive on my computer to do this? Right now, I have these files in spark/conf, following other people's examples I found on the internet. Will I just need to supply the username, password, and connection URL for the Hive server through these files, straight from Spark, for it to work? Thanks!


1 Answer


You would just need the Hive client libraries, not a HiveServer; Spark already includes these.
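
For example, here's a minimal sketch of querying Hive through Spark itself, assuming your core-site.xml, hdfs-site.xml, and hive-site.xml are in spark/conf as you describe (the database and table names are placeholders):

    // Minimal sketch: query a remote Hive table using the Hive client
    // libraries bundled with Spark. Assumes core-site.xml, hdfs-site.xml,
    // and hive-site.xml are on the classpath ($SPARK_HOME/conf).
    // The database and table names are hypothetical.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("remote-hive-query")
      .enableHiveSupport() // use the Hive metastore configured in hive-site.xml
      .getOrCreate()

    spark.sql("SELECT * FROM mydb.mytable LIMIT 10").show()

    spark.stop()

You can run this as-is from spark-shell; no separate Hive installation is needed on the Mac.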

You can also use any JDBC client (for example, I've had success with DbVisualizer on my Mac) together with the Hive JDBC jar; it doesn't necessarily have to be Spark SQL.
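
A rough sketch of that plain-JDBC route, assuming a HiveServer2 reachable at hive-host:10000 and the hive-jdbc jar on the classpath (the host, credentials, and table name are all placeholders):

    // Minimal sketch: query HiveServer2 over JDBC, no Spark involved.
    // Host, port, credentials, and table name are placeholders.
    import java.sql.DriverManager

    Class.forName("org.apache.hive.jdbc.HiveDriver") // driver from the hive-jdbc jar

    val url = "jdbc:hive2://hive-host:10000/default"
    val conn = DriverManager.getConnection(url, "username", "password")
    try {
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SELECT * FROM mytable LIMIT 10")
      while (rs.next()) println(rs.getString(1)) // print the first column of each row
    } finally {
      conn.close()
    }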

Similar post - How to connect to a Hive metastore programmatically in SparkSQL?

OneCricketeer