
Currently I have:

  • 1 file with 9 million lines
  • BufferedReader.readLine() to read every line
  • String.split() to parse every line (columns separated by a pipe)
  • A lot of RAM used (because of String interning?)

The problem is: As you may have guessed, I want to read and parse this file a little better...
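
For reference, here is roughly what my current reading code looks like (a simplified sketch; the file name and the POJO handling are placeholders for my real code):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CurrentApproach {
    public static void main(String[] args) throws IOException {
        // "dump.txt" is a placeholder for the real file name
        BufferedReader reader = new BufferedReader(new FileReader("dump.txt"));
        String line;
        while ((line = reader.readLine()) != null) {
            // the pipe must be escaped because split() takes a regular expression
            String[] columns = line.split("\\|");
            // ... build a POJO from columns[0], columns[1], ... and store it in a Map
        }
        reader.close();
    }
}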

Questions:

  • How do I read this relatively big file using the least amount of resources (knowing that every line will need some kind of "split" on pipe)?
  • Can I replace String.split with something else (let's say StringBuilder, CharBuffer, ...)?
  • What's the best way to avoid creating Strings while reading the file, until each line has been split into its final character sequences?
  • I don't mind using something other than String in my POJOs, if you have anything better.
  • The file will be reloaded every few hours, if that helps in choosing a solution.

Thank you :)

AndrewBourgeois
  • You should check out my answer on this post: http://stackoverflow.com/questions/5854859/faster-way-to-read-file/5854889#5854889 possible duplicate? – netbrain May 03 '11 at 08:02
  • What is the problem you want to solve? Memory consumption? speed of parsing? What do you do with the information you read? Do you need it all available? Why do you reload every few hours and what do you do at that time with the previous data? – RonK May 03 '11 at 08:03
  • Try using [StringTokenizer](http://download.oracle.com/javase/6/docs/api/java/util/StringTokenizer.html) instead of split. – Daniel Lubarov May 03 '11 at 08:40
  • @RonK: It's a dump from an Oracle DB table, in order to limit the traffic on that database. The file is reloaded because the data can change in that table. The issue is RAM, as 18GB RAM still gives me full GCs, stopping the JVM for 0.1 - 4.0 seconds multiple times during the reload. The data is saved in a HashMap, which is trashed before reloading (by assigning a new HashMap to the reference). @Ido Weinstein: 900 MB. – AndrewBourgeois May 03 '11 at 08:56
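
To make the reload pattern from the last comment concrete, here is a minimal sketch of building a fresh map and swapping the reference (the class, field and type parameters are invented for illustration; the real keys and values may differ):

import java.util.HashMap;
import java.util.Map;

public class TableCache {
    // volatile so reader threads see the fully built replacement map after the swap
    private volatile Map<String, String> table = new HashMap<String, String>();

    // Called every few hours: the new map is built completely, then the reference
    // is swapped. The old map becomes garbage all at once, which is a likely
    // contributor to the long GC pauses mentioned above.
    public void reload(Map<String, String> freshlyParsed) {
        table = freshlyParsed;
    }

    public String lookup(String key) {
        return table.get(key);
    }
}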

1 Answer


A 9 million line file should take only a few seconds to process. Most of that time will be spent reading the data into memory; how you split up the data is unlikely to make much difference.

BufferedReader and String.split sound fine to me. I wouldn't use interning unless you are sure it will help. (Java won't call intern() for you.)
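
If many rows repeat the same column values, you can deduplicate them yourself, either with String.intern() (which on Java 6 keeps the interned strings in PermGen) or with your own pool. A minimal sketch, assuming a hand-rolled pool class (the names ColumnPool and canonical are made up for illustration):

import java.util.HashMap;
import java.util.Map;

public class ColumnPool {
    private final Map<String, String> pool = new HashMap<String, String>();

    // Returns a canonical copy of the value, so rows with identical column values
    // share a single String instance instead of holding one copy per row.
    public String canonical(String value) {
        String existing = pool.get(value);
        if (existing == null) {
            pool.put(value, value);
            return value;
        }
        return existing;
    }
}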

The latest version of Java 6 has some performance improvements in the handling of Strings. I would try Java 6 update 25 to see if it is any faster.


EDIT: Some testing shows that split is surprisingly slow and that you can improve on it.

import java.io.*;
import java.util.*;
import java.util.regex.Pattern;

public static void main(String... args) throws IOException {
    long start1 = System.nanoTime();
    PrintWriter pw = new PrintWriter("deleteme.txt");
    StringBuilder sb = new StringBuilder();
    for (int j = 1000; j < 1040; j++)
        sb.append(j).append(' ');
    String outLine = sb.toString();
    for (int i = 0; i < 1000 * 1000; i++)
        pw.println(outLine);
    pw.close();
    long time1 = System.nanoTime() - start1;
    System.out.printf("Took %f seconds to write%n", time1 / 1e9);

    {
        long start = System.nanoTime();
        FileReader fr = new FileReader("deleteme.txt");
        char[] buffer = new char[1024 * 1024];
        while (fr.read(buffer) > 0) ;
        fr.close();
        long time = System.nanoTime() - start;
        System.out.printf("Took %f seconds to read text as fast as possible%n", time / 1e9);
    }
    {
        long start = System.nanoTime();
        BufferedReader br = new BufferedReader(new FileReader("deleteme.txt"));
        String line;
        while ((line = br.readLine()) != null) {
            String[] words = line.split(" ");
        }
        br.close();
        long time = System.nanoTime() - start;
        System.out.printf("Took %f seconds to read lines and split%n", time / 1e9);
    }
    {
        long start = System.nanoTime();
        BufferedReader br = new BufferedReader(new FileReader("deleteme.txt"));
        String line;
        Pattern splitSpace = Pattern.compile(" ");
        while ((line = br.readLine()) != null) {
            String[] words = splitSpace.split(line, 0);
        }
        br.close();
        long time = System.nanoTime() - start;
        System.out.printf("Took %f seconds to read lines and split (precompiled)%n", time / 1e9);
    }
    {
        long start = System.nanoTime();
        BufferedReader br = new BufferedReader(new FileReader("deleteme.txt"));
        String line;
        List<String> words = new ArrayList<String>();
        while ((line = br.readLine()) != null) {
            words.clear();
            int pos = 0, end;
            while ((end = line.indexOf(' ', pos)) >= 0) {
                words.add(line.substring(pos, end));
                pos = end + 1;
            }
            // every value written above is followed by a space, so the loop captures all the tokens;
            // this is where the words would be turned into a POJO
        }
        br.close();
        long time = System.nanoTime() - start;
        System.out.printf("Took %f seconds to read lines and break using indexOf%n", time / 1e9);
    }
}

prints

Took 1.757984 seconds to write
Took 1.158652 seconds to read text as fast as possible
Took 6.671587 seconds to read lines and split
Took 4.210100 seconds to read lines and split (precompiled)
Took 1.642296 seconds to read lines and break using indexOf

So it appears that splitting the string yourself is an improvement and gets you close to reading the text as fast as you can. The only way to read it faster is to treat the file as binary/ASCII-7. ;)
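
Applied to the pipe-separated file from the question, the same indexOf approach would look roughly like the sketch below (the file name is a placeholder, and it assumes lines don't end with a trailing pipe):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class PipeSplit {
    public static void main(String... args) throws IOException {
        // "dump.txt" is a placeholder for the real file name
        BufferedReader br = new BufferedReader(new FileReader("dump.txt"));
        String line;
        List<String> columns = new ArrayList<String>();
        while ((line = br.readLine()) != null) {
            columns.clear();
            int pos = 0, end;
            while ((end = line.indexOf('|', pos)) >= 0) {
                columns.add(line.substring(pos, end));
                pos = end + 1;
            }
            columns.add(line.substring(pos)); // last column has no trailing pipe
            // ... build the POJO from the columns here ...
        }
        br.close();
    }
}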

Peter Lawrey
  • It makes sense that split is slow, since it does regular expression searches. I wonder if it even caches the regex? Might be recompiling it each time. – Daniel Lubarov May 03 '11 at 08:36
  • @Daniel, Pre-compiling the regex does help. I have updated the answer. – Peter Lawrey May 03 '11 at 08:53
  • I guess I just misused the word interning thinking that every distinct String automagically went into a pool by interning. Now that you tell me that the reading and parsing part shouldn't take long, I'll try to separate the "in-memory table" code from the rest and see where the RAM goes ... – AndrewBourgeois May 03 '11 at 09:20
  • @IDemmel, You might want to read the Javadoc for String.intern(). String literals are automagically interned. ;) – Peter Lawrey May 03 '11 at 09:28