
I have used the google-cloud-storage-connector for Hadoop and was able to run a MapReduce job that takes input from my local HDFS (Hadoop running on my local machine) and places the result in a Google Cloud Storage bucket.
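
For reference, here is roughly how my working GCS job driver is configured. The property names are taken from the GCS connector documentation (auth properties can differ between connector versions), and the project id, bucket name, and key path are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class HdfsToGcsJob {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Register the GCS connector as the handler for gs:// URIs.
            conf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem");
            conf.set("fs.gs.project.id", "my-project-id"); // placeholder project id
            // Service-account auth; exact property names depend on connector version.
            conf.set("google.cloud.auth.service.account.enable", "true");
            conf.set("google.cloud.auth.service.account.json.keyfile", "/path/to/key.json"); // placeholder

            Job job = Job.getInstance(conf, "hdfs-to-gcs");
            job.setJarByClass(HdfsToGcsJob.class);
            // Mapper/reducer classes omitted; Hadoop's defaults pass records through,
            // so this simply copies HDFS input to the GCS output path.
            FileInputFormat.addInputPath(job, new Path("hdfs://localhost:9000/input"));
            FileOutputFormat.setOutputPath(job, new Path("gs://my-bucket/output")); // placeholder bucket
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }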

Now I want to run a MapReduce job using the google-datastore-connector for Hadoop that takes input from local HDFS (Hadoop running on my local machine) and places the result in a Cloud Datastore kind (a kind is roughly analogous to a database table).
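
To make the question concrete, here is a skeleton of the driver I have in mind. The Datastore-specific parts (the connector's OutputFormat class and its configuration properties) are guesses on my part and are left commented out; confirming or correcting them is exactly what I am asking for:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

    public class HdfsToDatastoreJob {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The connector presumably needs the project/dataset that owns the
            // target kind; the exact property name is one thing I'm unsure about.
            // conf.set("<datastore dataset id property>", "my-project-id");

            Job job = Job.getInstance(conf, "hdfs-to-datastore");
            job.setJarByClass(HdfsToDatastoreJob.class);
            FileInputFormat.addInputPath(job, new Path("hdfs://localhost:9000/input"));

            // Instead of FileOutputFormat + a gs:// path, I expect to plug in the
            // Datastore connector's OutputFormat, with the reducer emitting
            // Datastore entities for the target kind. The class name below is my
            // guess, not verified:
            // job.setOutputFormatClass(com.google.cloud.hadoop.io.datastore.DatastoreHadoopOutputFormat.class);

            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }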

Please help me understand what configurations I need to set and what steps I have to follow.

  • Do you have a code snippet that shows what you are struggling with? – kaz Jun 03 '15 at 14:54
  • Sorry kaz, I don't have any code for this use case. I have read this article http://stackoverflow.com/questions/25291397/migrating-50tb-data-from-local-hadoop-cluster-to-google-cloud-storage about copying data from HDFS to Google Cloud Storage, and wanted to know whether the same is possible using the google-datastore-connector for Hadoop. – hadoop godc Jun 09 '15 at 08:52

0 Answers