
I have to read a large text file (approximately 5 MB).

To read this file I am using BufferedReader(), but it causes the heap to grow and appears to leak memory. Are there any alternative options to optimize my code?

    StringBuffer sb = new StringBuffer();
    BufferedReader reader = new BufferedReader(new FileReader(vCache));
    String line = null;

    while ((line = reader.readLine()) != null) {
        sb.append(line);
    }
iSun
  • Your loop is expensive (String concat) and probably your memory leak... – Johannes Jun 03 '13 at 11:15
  • Why do you read a 5 MB file? Please explain the requirement behind this. – krishnakumarp Jun 03 '13 at 11:20
  • @krishnakumarp Well, I need to fetch all my web-server database rows, save them in a text file, parse it into JSON format, and finally insert it row by row into my local database. – iSun Jun 03 '13 at 11:24
  • Do the parsing on your web server and load the JSON from Android. – Rudy Jun 03 '13 at 11:27
  • In that case, try to structure the file such that each db row is on its own line, and process it line by line without keeping it all in memory (StringBuffer). See my answer. – krishnakumarp Jun 03 '13 at 11:54

6 Answers


Try using an InputStream instead of a BufferedReader:

try (InputStream is = new FileInputStream(vCache)) {
    byte[] b = new byte[is.available()]; // file size, for a local FileInputStream
    int off = 0;
    while (off < b.length) { // read() may return fewer bytes than requested
        int n = is.read(b, off, b.length - off);
        if (n < 0) break;
        off += n;
    }
    String text = new String(b, 0, off, "UTF-8");
} catch (IOException e) {
    e.printStackTrace();
}
reidzeibel

I'm guessing you're reading a local file. In this case, you may be better off reading the entire file into a byte array and then converting to String:

try (InputStream is = new FileInputStream(vCache)) {
    byte[] buffer = new byte[is.available()]; // file size, for a local file
    int off = 0;
    while (off < buffer.length) { // loop: read() may not fill the buffer at once
        int n = is.read(buffer, off, buffer.length - off);
        if (n < 0) break;
        off += n;
    }
    jsonContent = new String(buffer, 0, off, "UTF-8");
}

However, you may still be inviting problems by reading such a large file into memory on Android. I'd say that if you need to read a 5 MB JSON file, you're probably not structuring your app correctly.

Aleks G

The default buffer size used by BufferedReader is 8 KB, but since you are reading line by line, the accumulation will be higher. To improve this you can use:

BufferedReader(Reader in, int sz) <-- use sz with a smaller value, say 4 KB

read(char[] cbuf) <-- constrain the cbuf size to that of the reader's buffer

close() <-- whatever memory was held by the reader instance can now be GC'ed

That said, your StringBuffer sb holds the complete file content, so even after making the above changes, if the required memory (~ file size) is not available to the JVM you will again end up with an OOM issue. I am not sure whether that is the case for you; otherwise the above should reduce local memory spikes a bit.
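A minimal sketch of the above, assuming vCache is the file in question and that each chunk can be processed immediately instead of being accumulated:

    // Smaller explicit buffer (4 KB instead of the 8 KB default)
    BufferedReader reader = new BufferedReader(new FileReader(vCache), 4 * 1024);
    try {
        char[] cbuf = new char[4 * 1024]; // matches the reader's buffer size
        int n;
        while ((n = reader.read(cbuf)) != -1) {
            // process cbuf[0..n) here instead of appending it to a StringBuffer
        }
    } finally {
        reader.close(); // the reader's buffer can now be GC'ed
    }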

harsh

You're parsing JSON.

You could make the input file smaller by removing pretty-printing (e.g. indentation, newlines, etc.) if it's there.

You could also try a parser that reads directly from streams, hopefully it won't need to buffer everything at once. For example, Android provides JsonReader, which allows you to parse a stream and control the data structures yourself, which means you could use more memory efficient structures, and it also wouldn't buffer the whole stream. Unfortunately, it was added in API level 11, so backward compatibility might be an issue.
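A minimal sketch of JsonReader usage, assuming the file holds a top-level array of row objects; the "id" and "name" fields and the insertRow helper are hypothetical stand-ins for your actual columns:

    JsonReader reader = new JsonReader(
            new InputStreamReader(new FileInputStream(vCache), "UTF-8"));
    try {
        reader.beginArray(); // top-level JSON array of rows
        while (reader.hasNext()) {
            reader.beginObject(); // one db row
            long id = -1;
            String name = null;
            while (reader.hasNext()) {
                String field = reader.nextName();
                if (field.equals("id")) {
                    id = reader.nextLong();
                } else if (field.equals("name")) {
                    name = reader.nextString();
                } else {
                    reader.skipValue(); // ignore fields we don't need
                }
            }
            reader.endObject();
            insertRow(id, name); // hypothetical: write one row to the local db
        }
        reader.endArray();
    } finally {
        reader.close();
    }

This way only one row is ever held in memory at a time.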

One alternative, if the top-level object is an array, is to split it into several smaller arrays, perhaps in different files, parse them separately, and merge the sub-arrays. If the base objects have similar structures, you can translate them into Java objects before merging, which would give a more compact memory footprint.

Vlad
  • Thanks Vlad, yes I'm trying to parse JSON. Could you please give me an example of reading JSON from a stream? – iSun Jun 03 '13 at 11:18
  • Try using the Jackson JSON parser, http://wiki.fasterxml.com/JacksonHome; you can parse JSON directly from an `inputStream`. – reidzeibel Jun 03 '13 at 11:55

Your code ... as written ... reads lines and accumulates them in a StringBuffer. The mere fact that you are accumulating the lines is a form of memory leak.

The best way to prevent that leak is to change your application to work like this:

    BufferedReader reader = new BufferedReader(new FileReader(vCache));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            process(line); // handle each line, then let it be GC'ed
        }
    } finally {
        reader.close();
    }

In other words, DON'T accumulate the lines in memory. Process them as you read them and then discard them.


If your processing is such that you have to accumulate the lines in memory, then you will get better memory usage if you allocate the StringBuilder like this:

    StringBuilder sb = new StringBuilder(fileSizeInCharacters);

That will avoid the need to reallocate, which can (in the worst case) require 3 times as many characters as the file size (in characters).

However you will run into the same problem, sooner or later. Accumulating the file content in memory doesn't scale.


Your comments indicate that this is really a JSON processing problem, and there are existing Q&As on the topic of streaming JSON processing.

The idea of a streaming API is that you don't need to convert the JSON "object" into an in-memory tree structure that represents the whole thing.

Stephen C

Send the JSON such that each line corresponds to one complete, well-formed JSON db row. This way you don't have to process the whole file at once.

//StringBuffer sb = new StringBuffer(); // no longer needed
BufferedReader reader = new BufferedReader(new FileReader(vCache));
String line = null;

while ((line = reader.readLine()) != null) {
    // Parse the line as JSON
    // Insert into the local SQLite DB
}
reader.close();
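A minimal sketch of that loop, assuming each line is one JSON object such as {"id": 1, "name": "foo"} and that db is an open SQLiteDatabase; the table and column names are hypothetical:

    BufferedReader reader = new BufferedReader(new FileReader(vCache));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            try {
                JSONObject row = new JSONObject(line); // parse just this one row
                ContentValues values = new ContentValues();
                values.put("id", row.getLong("id"));
                values.put("name", row.getString("name"));
                db.insert("rows", null, values); // insert, then discard the line
            } catch (JSONException e) {
                // skip or log a malformed line
            }
        }
    } finally {
        reader.close();
    }

Wrapping the whole loop in a single transaction (db.beginTransaction() / db.setTransactionSuccessful() / db.endTransaction()) will also make the inserts considerably faster.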
krishnakumarp
  • 8,967
  • 3
  • 49
  • 55