
I have installed a Spark 1.5.2 build with Hive support on a Linux machine. The default path for the Hive metastore warehouse directory is /user/hive/warehouse.

  1. Is this a local path or an HDFS path? I ask because I couldn't find this path on the Linux filesystem (see the sketch below for one way to check).
  2. If it's an HDFS path (most likely), can we access it without having installed Hadoop, with or without a Spark build?
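
A minimal way to check this from spark-shell (a 1.5.x build with Hive, where sc is the SparkContext the shell provides) is to ask Hadoop which filesystem the path resolves against; the URI scheme it prints tells local apart from HDFS:

    import org.apache.hadoop.fs.{FileSystem, Path}

    // The filesystem Spark/Hadoop is actually configured with:
    // a file:/// URI means the local filesystem, hdfs://host:port means HDFS.
    val fs = FileSystem.get(sc.hadoopConfiguration)
    println(fs.getUri)

    // Does the warehouse directory exist on that filesystem?
    println(fs.exists(new Path("/user/hive/warehouse")))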
Jacek Laskowski
scooby

3 Answers


You can also create the warehouse on the local filesystem. In your hive-site.xml, set the property

hive.metastore.warehouse.dir = file:///tmp
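
For reference, hive-site.xml properties are written as XML, so the setting above would look roughly like this (the file goes on Spark's conf/ path; file:///tmp is just the example value from this answer, any local directory works):

    <configuration>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>file:///tmp</value>
      </property>
    </configuration>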

anubhav

Yes, /user/hive/warehouse is an HDFS path. You'll need to install and run the Hadoop HDFS services (at least the namenode, secondary namenode, and datanode) to make HDFS available.
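
As a rough sketch (assuming a single-node Hadoop 2.x install with its sbin and bin directories on the PATH; script names and paths may differ on your layout), bringing up HDFS and looking at the warehouse directory could go like this:

    start-dfs.sh                             # starts the namenode, secondary namenode and datanode
    jps                                      # confirm the daemons are running
    hdfs dfs -mkdir -p /user/hive/warehouse  # create the warehouse dir on HDFS if it is not there yet
    hdfs dfs -ls /user/hive/warehouse        # note: this lives on HDFS, not on the local Linux filesystem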

facha

Type jps and see whether all the Hadoop services are running. If they are, check whether metastore_db is present under the path /user/hive/warehouse.