
I have this method that takes a JSON file from storage and turns it into a bunch of Java objects. How do I add buffered input/output streams to speed it up? Or is there another way to optimize the speed?

EDIT: I'm not only going to use this for reading from JSON files, so I don't need JSON-to-Java parsers; I actually need to speed up the file operations using buffers :)

public static ArrayList<String> convertJSONtoArrayList(File jsonStrings) {
        FileInputStream fileInputStream = null;

        try {
            fileInputStream = new FileInputStream(jsonStrings);
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        }

        return convertJSONtoArrayList(fileInputStream);
    }

    public static ArrayList<String> convertJSONtoArrayList(InputStream fileInputStream) {
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();

        ArrayList<String> arrayListString = new ArrayList<String>();

        int ctr;
        try {
            if (fileInputStream != null) {
                ctr = fileInputStream.read();
                while (ctr != -1) {
                    byteArrayOutputStream.write(ctr);
                    ctr = fileInputStream.read();
                }
                fileInputStream.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            // Parse the data into jsonobject to get original data in form of
            // json.
            JSONArray jsonArray = new JSONArray(byteArrayOutputStream.toString());
            int arrayLength = jsonArray.length();
            for (int i = 0; i < arrayLength; i++) {

                JSONObject jsonObject = jsonArray.getJSONObject(i);
                arrayListString.add(jsonObject.getString(Tags.VALUE));

            }

        } catch (Exception e) {
            e.printStackTrace();
        }

        return arrayListString;
    }

My idea that this is possible comes from here: How to speed up unzipping time in Java / Android? - the answers there make it clear that adding a BufferedInputStream speeds the operation up considerably.

Kaloyan Roussev
  • Did you check that file reading is your bottleneck? Are you sure you need to optimize this? Why not just show a "Loading" dialog? This smells like premature optimization. – m0skit0 Jun 30 '14 at 11:01
  • Well, I read 10 files and turn them into ArrayLists and HashMaps, and these operations take a lot of time (10-20 seconds), but I'm not exactly sure where the problem is – Kaloyan Roussev Jun 30 '14 at 11:03
  • 10-20 seconds might not be a lot of time depending on files size. Anyway if the files are independent, consider threading the readings. – m0skit0 Jun 30 '14 at 11:04
  • The readings happen in an AsyncTask; are you suggesting I open a separate thread for each file simultaneously? How do I do that? Also, isn't buffering going to solve this? – Kaloyan Roussev Jun 30 '14 at 11:05
  • In any case I suggest you first spot where the problem happens, then try to find a solution. Optimizing the wrong part will lead you nowhere. Use Android DDMS for checking which methods take most of the time. – m0skit0 Jun 30 '14 at 11:06
  • I mean one thread per file, yes. How you're going to do that is your problem, but it doesn't look very hard (one AsyncTask per file instead of all files, how hard can it be?). Also please note [AsyncTask isn't really multi-threading](http://www.jayway.com/2012/11/28/is-androids-asynctask-executing-tasks-serially-or-concurrently/). – m0skit0 Jun 30 '14 at 11:07
  • @J.Kowalski To truly multi-thread an `AsyncTask`, you want to look into using [`executeOnExecutor()`](http://developer.android.com/reference/android/os/AsyncTask.html#executeOnExecutor%28java.util.concurrent.Executor,%20Params...%29). – Andrew Schuster Jun 30 '14 at 13:05
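A minimal sketch of the executeOnExecutor() approach mentioned in the comments, assuming one AsyncTask per file; the ParseFileTask class and the jsonFiles collection are hypothetical names:

class ParseFileTask extends AsyncTask<File, Void, List<String>> {
    @Override
    protected List<String> doInBackground(File... files) {
        // Reuse the existing parsing method for a single file.
        return convertJSONtoArrayList(files[0]);
    }
    // onPostExecute(List<String> result) would deliver the result on the UI thread.
}

// Runs the tasks on the shared thread pool instead of the default serial executor (API 11+).
for (File f : jsonFiles) {
    new ParseFileTask().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, f);
}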

2 Answers

public static List<String> convertJSONtoArrayList(File f) {
    final StringBuilder jsonString = new StringBuilder(500);
    try {
        final BufferedReader reader = new BufferedReader(new FileReader(f));
        String s;
        while ((s = reader.readLine()) != null) {
            jsonString.append(s);
        }
        reader.close();
    } catch (IOException e) {
        e.printStackTrace();
        return Collections.emptyList();
    }

    try {
        JSONArray jsonArray = new JSONArray(jsonString.toString());
        int arrayLength = jsonArray.length();
        final List<String> result = new ArrayList<String>(arrayLength);
        for (int i = 0; i < arrayLength; i++) {
            JSONObject jsonObject = jsonArray.getJSONObject(i);
            result.add(jsonObject.getString(Tags.VALUE));
        }
        return result;
    } catch (Exception e) {
        e.printStackTrace();
        return Collections.emptyList();
    }
}

Or try this (should be faster):

public static List<String> convertJSONtoArrayList(File f) {
    // File.length() returns a long; ByteBuffer.allocate() takes an int.
    ByteBuffer buffer = ByteBuffer.allocate((int) f.length());
    try {
        ReadableByteChannel channel = Channels.newChannel(new FileInputStream(f));
        // A single read() may not fill the buffer, so keep reading until EOF or the buffer is full.
        while (channel.read(buffer) > 0) {
            // keep filling the buffer
        }
        channel.close();
    } catch (Exception e) {
        e.printStackTrace();
        return Collections.emptyList();
    }

    final String jsonString = new String(buffer.array());

    try {
        JSONArray jsonArray = new JSONArray(jsonString);
        int arrayLength = jsonArray.length();
        final List<String> result = new ArrayList<String>(arrayLength);
        for (int i = 0; i < arrayLength; i++) {
            JSONObject jsonObject = jsonArray.getJSONObject(i);
            result.add(jsonObject.getString(Tags.VALUE));
        }
        return result;
    } catch (Exception e) {
        e.printStackTrace();
        return Collections.emptyList();
    }
}
Autocrab

Use the same code but wrap a BufferedInputStream around the FileInputStream.
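
A minimal sketch of that change, reusing the question's InputStream overload and assuming the rest of the code stays the same:

public static ArrayList<String> convertJSONtoArrayList(File jsonStrings) {
    InputStream in = null;
    try {
        // BufferedInputStream reads the file in large chunks, so the byte-by-byte
        // read() loop in the InputStream overload no longer hits the disk per byte.
        in = new BufferedInputStream(new FileInputStream(jsonStrings));
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    return convertJSONtoArrayList(in);
}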

user207421