
The following code writes data to an Avro data file and then reads it back and displays it. I was just trying out the example from the Hadoop: The Definitive Guide book. It ran fine the first time, but ever since I get the error below, so I am not sure what mistake I am making.

This is the exception:

Exception in thread "main" java.io.EOFException: No content to map to Object due to end of input
    at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2173)
    at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2106)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1065)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1040)
    at org.apache.avro.Schema.parse(Schema.java:895)
    at org.avro.example.SimpleAvro.AvroExample.avrocreate(AvroDataExample.java:23)
    at org.avro.example.SimpleAvro.AvroDataExample.main(AvroDataExample.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

This is the code:

package org.avro.example.SimpleAvro;

import java.io.File;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;

class AvroExample{

    AvroExample(){

    }
    void avrocreate() throws Exception{

        Schema schema=Schema.parse(getClass().getResourceAsStream("Pair.avsc"));

        GenericRecord datum=new GenericData.Record(schema);
        datum.put("left", "L");
        datum.put("right", "R");

        File file=new File("data.avro");
        DatumWriter<GenericRecord> writer=new GenericDatumWriter<GenericRecord>(schema);
        DataFileWriter<GenericRecord> dataFileWriter=new DataFileWriter<GenericRecord>(writer);
        dataFileWriter.create(schema, file);
        dataFileWriter.append(datum);
        dataFileWriter.close();

        System.out.println("Written to avro data file");
        //reading from the avro data file

        DatumReader<GenericRecord> reader= new GenericDatumReader<GenericRecord>();
        DataFileReader<GenericRecord> dataFileReader=new DataFileReader<GenericRecord>(file,reader);
        GenericRecord result=dataFileReader.next();
        System.out.println("data" + result.get("left").toString());

        result=dataFileReader.next();
        System.out.println("data :" + result.get("left").toString());


    }

}
public class AvroDataExample {
    public static void main(String args[])throws Exception{

        AvroExample a=new AvroExample();
        a.avrocreate();
    }



}

The following is the Pair.avsc file [given in the book's example code]:

{
  "type": "record",
  "name": "Pair",
  "doc": "A pair of strings.",
  "fields": [
    {"name": "left", "type": "string"},
    {"name": "right", "type": "string"}
  ]
}
Sri
  • When I tried to embed the schema as a string, I was able to run the program successfully. – Sri Apr 05 '11 at 00:06

3 Answers


You are probably not reading the schema file correctly. I suspect this is the problem because the stack trace shows that it is failing to parse the schema:

Exception in thread "main" java.io.EOFException: No content to map to Object due to end of input
    at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2173)
    at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2106)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1065)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1040)
    at org.apache.avro.Schema.parse(Schema.java:895)

Reading files from "resources" is fraught with problems unless you have your environment set up just right. Also, since you mentioned that it worked once before, you may just have changed some environmental setting (such as the working directory) for the second run.

Try copy-pasting the schema into a String variable and parsing it directly rather than using the resource loader:

String schemaJson = "paste schema here (and fix quotes)";
Schema schema = Schema.parse(schemaJson);
GenericRecord datum = new GenericData.Record(schema);
...
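For example, here is a rough sketch with the Pair schema from the question embedded directly; only the quote escaping differs from the .avsc contents, and it uses the same Schema.parse(String) call shown above:

String schemaJson =
      "{"
    + "  \"type\": \"record\","
    + "  \"name\": \"Pair\","
    + "  \"doc\": \"A pair of strings.\","
    + "  \"fields\": ["
    + "    {\"name\": \"left\", \"type\": \"string\"},"
    + "    {\"name\": \"right\", \"type\": \"string\"}"
    + "  ]"
    + "}";
Schema schema = Schema.parse(schemaJson);              // parse from the String, not from a resource stream
GenericRecord datum = new GenericData.Record(schema);
datum.put("left", "L");
datum.put("right", "R");

The rest of the write/read code stays the same.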
pawstrong
  • I had the same problem and fixed it by putting the schema in the code. But I don't know what is wrong with my environment setup that it couldn't find the avsc file. Could you help me please? – gabi Jun 03 '14 at 10:00
    GenericRecord result=dataFileReader.next();
    System.out.println("data" + result.get("left").toString());
    result=dataFileReader.next();
    System.out.println("data :" + result.get("left").toString());

I guess this is where you are going wrong.

You should read the "left" field and the "right" field of your record, rather than reading "left" twice.
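Based on the question's code, here is a minimal sketch of reading both fields (note that only one record was appended, so there is just the one record to read):

    GenericRecord result = dataFileReader.next();                     // the single record written above
    System.out.println("left : " + result.get("left").toString());   // prints L
    System.out.println("right: " + result.get("right").toString());  // prints R
    dataFileReader.close();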

Try it.

It worked for me.

RimiD

If the file is at the root of your jar, put a slash before the file name.

Schema.parse(getClass().getResourceAsStream("/Pair.avsc"));
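
Without the slash, getResourceAsStream resolves the name relative to the class's package (org/avro/example/SimpleAvro/Pair.avsc inside the jar); with the leading slash it resolves from the classpath root. As a rough diagnostic sketch (the null check is an addition of mine, not part of the original code), you can fail fast when the resource is missing instead of handing a bad stream to the parser:

InputStream in = getClass().getResourceAsStream("/Pair.avsc");   // classpath root
if (in == null) {
    throw new IllegalStateException("Pair.avsc not found on the classpath");
}
Schema schema = Schema.parse(in);   // same parse(InputStream) call the question uses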