
I am attempting to run a simple MapReduce process to write HFiles for later import into an HBase table.

When the job is submitted:

hbase com.pcoa.Driver /test /bulk pcoa

I am getting the following exception, indicating that netty-3.6.6.Final.jar does not exist in HDFS (it does, however, exist locally):

    -rw-r--r--+ 1 mbeening flprod 1206119 Sep 18 18:25 /dedge1/hadoop/hbase-0.96.1.1-hadoop2/lib/netty-3.6.6.Final.jar

I am afraid I do not understand how to address this configuration(?) error.

Can anyone provide any advice to me?

Here is the exception:

    Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost/dedge1/hadoop/hbase-0.96.1.1-hadoop2/lib/netty-3.6.6.Final.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1110)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:264)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:300)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:387)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
        at com.pcoa.Driver.main(Driver.java:63)
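
Note that the jar path in the exception carries no scheme of its own; it looks like the client took the local HBase lib entries from the classpath and resolved them against `fs.defaultFS` (here `hdfs://localhost`) while staging the job's distributed cache. A minimal sketch of that resolution using only `java.net.URI` (the real JobSubmitter logic is more involved; this just illustrates why a scheme-less local path turns into an HDFS path):

```java
import java.net.URI;

public class SchemeResolution {
    public static void main(String[] args) {
        // A classpath entry with no scheme, as seen in the stack trace.
        URI jar = URI.create("/dedge1/hadoop/hbase-0.96.1.1-hadoop2/lib/netty-3.6.6.Final.jar");

        // The cluster's default filesystem (fs.defaultFS).
        URI defaultFs = URI.create("hdfs://localhost/");

        // Resolving the bare path against the default FS yields an HDFS URI,
        // which is where the job client then looks for the file.
        URI resolved = defaultFs.resolve(jar);
        System.out.println(resolved);
        // -> hdfs://localhost/dedge1/hadoop/hbase-0.96.1.1-hadoop2/lib/netty-3.6.6.Final.jar
    }
}
```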

Here is my driver routine:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Driver {

    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        Job job = new Job(conf, "HBase Bulk Import");

        job.setJarByClass(HBaseKVMapper.class);
        job.setMapperClass(HBaseKVMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(KeyValue.class);

        job.setInputFormatClass(TextInputFormat.class);

        // Configure partitioning and total-order sorting so the generated
        // HFiles line up with the target table's regions.
        HTable hTable = new HTable(conf, args[2]);
        HFileOutputFormat.configureIncrementalLoad(job, hTable);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.waitForCompletion(true);
    }
}
  • NOTE: I am on hadoop 2.2.0, hbase 0.96.1.1, zookeeper 3.4.5 – user3042401 Jan 09 '14 at 22:49
  • Is there any other information I could provide that would help me get a handle on what the issue is? Fundamentally, I don't understand why the process is looking for this jar in HDFS. – user3042401 Jan 13 '14 at 17:17
  • Processes that do not use this HFile-write approach seem to work OK on my cluster, but single-record inserts aren't going to work for my load. Any help that can be provided is sincerely appreciated. Thanks! – user3042401 Jan 13 '14 at 17:17

1 Answer


I am not sure why (or if) I had to do this (I didn't see anything like it in any of the startup docs anywhere),

but I ran this:

hdfs dfs -put /hadoop/hbase-0.96.1.1-hadoop2/lib/*.jar /hadoop/hbase-0.96.1.1-hadoop2/lib

And... my MR job seems to run now.

If this is an incorrect course of action, please let me know. Thanks!

apesa