
I have the following piece of code, which uses Java 7 features such as `java.nio.file.Files` and `java.nio.file.Paths`:

import java.io.File;
import java.io.IOException;
import java.io.StringWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.node.ObjectNode;


public class JacksonObjectMapper {

    public static void main(String[] args) throws IOException {

        byte[] jsonData = Files.readAllBytes(Paths.get("employee.txt"));
        ObjectMapper objectMapper = new ObjectMapper();
        Employee emp = objectMapper.readValue(jsonData, Employee.class);
        System.out.println("Employee Object\n"+emp);
        Employee emp1 = createEmployee();
        objectMapper.configure(SerializationFeature.INDENT_OUTPUT, true);
        StringWriter stringEmp = new StringWriter();
        objectMapper.writeValue(stringEmp, emp1);
        System.out.println("Employee JSON is\n"+stringEmp);
    }
}

Now I have to run the same code on Java 6. What are the best possible alternatives, other than using `FileReader`?

Amit Sharad

5 Answers


In the `Files` class source you can see that `readAllBytes` reads the bytes from an `InputStream`:

public static byte[] readAllBytes(Path path) throws IOException {
    long size = size(path);
    if (size > (long) Integer.MAX_VALUE)
        throw new OutOfMemoryError("Required array size too large");

    try (InputStream in = newInputStream(path)) {
        return read(in, (int) size);
    }
}

`return read(in, (int) size)` — here it uses a buffer to read the data from the `InputStream`.

So you can do it the same way yourself, or just use Guava or Apache Commons IO (http://commons.apache.org/io/).
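Doing it "the same way" on Java 6 means copying the stream into memory through a buffer yourself. A minimal sketch, using only `java.io` (the class and helper name are my own, not from the JDK):

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class Java6Files {

    // Java 6 equivalent of Files.readAllBytes: copy the stream
    // into an in-memory buffer, 8 KB at a time.
    public static byte[] readAllBytes(File file) throws IOException {
        InputStream in = new FileInputStream(file);
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            return out.toByteArray();
        } finally {
            in.close(); // no try-with-resources on Java 6
        }
    }
}
```

The resulting `byte[]` can be passed straight to `objectMapper.readValue(jsonData, Employee.class)` exactly as in the question.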

pomkine

Alternatives are the classes from `java.io`, Apache Commons IO, or Guava IO.

Guava is the most modern, so I think it is the best solution for you.

Read more: Guava's I/O package utilities, explained.

MariuszS

If you really don't want to use `FileReader` (though I didn't understand why), you can go for `FileInputStream`.

Syntax:

InputStream inputStream = new FileInputStream("employee.txt");
Reader reader = new InputStreamReader(inputStream);
Helios
  • That has exactly the same effect as a `FileReader` and still has the problem of using the default encoding for the current platform rather than a fixed encoding that you know matches that of the file. – Ian Roberts Jan 13 '14 at 10:52
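The encoding problem the comment points out can be fixed by passing an explicit charset name to `InputStreamReader` instead of relying on the platform default. A minimal sketch assuming the file is UTF-8 (the class and method name are my own):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;

public class Utf8ReaderDemo {

    // Open a Reader that decodes the file as UTF-8 regardless of
    // the JVM's default charset.
    public static Reader openUtf8Reader(File file) throws IOException {
        InputStream in = new FileInputStream(file);
        return new InputStreamReader(in, "UTF-8");
    }
}
```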

You are right to avoid FileReader as that always uses the default character encoding for the platform it is running on, which may not be the same as the encoding of the JSON file.

`ObjectMapper` has an overload of `readValue` that can read directly from a `File`; there's no need to buffer the content in a temporary `byte[]`:

Employee emp = objectMapper.readValue(new File("employee.txt"), Employee.class);
Ian Roberts

You can read all bytes of a file into byte array even in Java 6 as described in an answer to a related question:

import java.io.IOException;
import java.io.RandomAccessFile;

RandomAccessFile f = new RandomAccessFile(fileName, "r");
try {
    if (f.length() > Integer.MAX_VALUE)
        throw new IOException("File is too large");
    byte[] b = new byte[(int) f.length()];
    f.readFully(b);
    if (f.getFilePointer() != f.length())
        throw new IOException("File length changed while reading");
} finally {
    f.close(); // Java 6 has no try-with-resources
}

I added the checks leading to exceptions and changed `read` to `readFully`, as was proposed in comments under the original answer.
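For reuse, the snippet can be wrapped into a small helper class; this is just one way to package it (the class and method names are my own):

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class Java6FileBytes {

    // Read the whole file into a byte array using RandomAccessFile,
    // with the size check and readFully safeguards described above.
    public static byte[] readFileFully(String fileName) throws IOException {
        RandomAccessFile f = new RandomAccessFile(fileName, "r");
        try {
            if (f.length() > Integer.MAX_VALUE)
                throw new IOException("File is too large");
            byte[] b = new byte[(int) f.length()];
            f.readFully(b);
            if (f.getFilePointer() != f.length())
                throw new IOException("File length changed while reading");
            return b;
        } finally {
            f.close(); // release the file handle even on failure
        }
    }
}
```

The returned array can then be handed to Jackson's `readValue(byte[], Class)` just as in the question's Java 7 version.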

Palec