
I'm a noob to Java so I may be missing something obvious, but I have a function which works for files in the 200-300k range without issue; once I get to 1.4 MB it falls over silently!

Here's the code:

private String readOutputFile(String filename) {
  if (filename == null) {
    return null;
  }
  File file = new File(filename);
  FileInputStream fis = null;
  String fileContent = "";
  this.logger.log("Reading " + filename + " from filesystem.");
  try {
    fis = new FileInputStream(file);
    System.out.println("Total file size to read (in bytes) : " + fis.available());
    int content;
    while ((content = fis.read()) != -1) {
      fileContent += (char) content;
    }

  } catch (IOException e) {
    this.logger.log("IO Problem reading ITMS output file\n");
    e.printStackTrace();
    throw new Error("io-error/itms-output");
  } finally {
    try {
      if (fis != null)
        fis.close();
    } catch (IOException ex) {
      this.logger.log("IO Problem reading and/or closing ITMS output file\n");
      ex.printStackTrace();
      throw new Error("io-error/finally-block");
    }
  }
  this.logger.log("File content has been read in");
  String compressed = this.compress(this.cleanXML(fileContent));
  this.logger.log("The compressed file size is :" + compressed.length() + " bytes");

  return compressed;
}

When it hits the size threshold that causes it to fail, it seems to stay within the while loop, or at least that's my assumption: it does report "Total file size to read ..." to the console, but it never reaches the "File content has been read in" logging.

ken
    Could be running out of memory - that's a lot of strings being created with `fileContent += (char) content;`. Try just reading in the whole file with `Files.readAllLines`? – Evan Knowles Mar 15 '18 at 06:22
  • will try in morning, thanks for the suggestion. – ken Mar 15 '18 at 06:26
  • Check this link: https://stackoverflow.com/questions/625420/what-is-the-fastest-way-to-read-a-large-number-of-small-files-into-memory – Raj5198 Mar 15 '18 at 06:26
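Following up on the comment's suggestion, a minimal sketch of reading the whole file in a single call rather than byte by byte (this assumes the file fits in memory and is UTF-8 encoded; the filename is a placeholder):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadWholeFile {
    public static void main(String[] args) throws IOException {
        // Read the entire file into memory in one call (Java 7+).
        byte[] bytes = Files.readAllBytes(Paths.get("output.xml"));
        String fileContent = new String(bytes, StandardCharsets.UTF_8);
        System.out.println("Read " + fileContent.length() + " characters");
    }
}
```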

1 Answer


You are creating many temporary String objects by performing character concatenation in your loop; each `+=` copies the entire accumulated string, so the work grows quadratically with file size. I would use a StringBuilder. I would also prefer try-with-resources, which closes the stream automatically even when an exception is thrown. And if at all possible, I would prefer to stream from the InputStream to the OutputStream directly (instead of reading the whole file into memory). Anyway, based on what is here:

private String readOutputFile(String filename) {
    if (filename == null) {
        return null;
    }
    File file = new File(filename);
    StringBuilder sb = new StringBuilder();
    this.logger.log("Reading " + filename + " from filesystem.");
    try (FileInputStream fis = new FileInputStream(file)) {
        System.out.println("Total file size to read (in bytes) : " + fis.available());
        int content;
        while ((content = fis.read()) != -1) {
            sb.append((char) content);
        }
    } catch (IOException e) {
        this.logger.log("IO Problem reading ITMS output file\n");
        e.printStackTrace();
        throw new Error("io-error/itms-output");
    }
    this.logger.log("File content has been read in");
    String compressed = this.compress(this.cleanXML(sb.toString()));
    this.logger.log("The compressed file size is : " + compressed.length() + " bytes");

    return compressed;
}
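A side note on speed: an unbuffered `FileInputStream.read()` can make one system call per byte, which gets very slow on larger files. Wrapping it in a `BufferedInputStream` keeps the same byte-at-a-time loop but reads from an in-memory buffer. A sketch (the filename is a placeholder, and the `(char)` cast still assumes single-byte characters, as in the original code):

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class BufferedRead {
    public static void main(String[] args) throws IOException {
        StringBuilder sb = new StringBuilder();
        // BufferedInputStream fetches the file in large chunks, so most
        // read() calls hit the buffer instead of the operating system.
        try (BufferedInputStream in =
                 new BufferedInputStream(new FileInputStream("output.xml"))) {
            int c;
            while ((c = in.read()) != -1) {
                sb.append((char) c);
            }
        }
        System.out.println("Read " + sb.length() + " bytes as chars");
    }
}
```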
Elliott Frisch
  • sorry to ask the dumb question but what is `try-with-resources`? – ken Mar 15 '18 at 18:10
  • as for the elegance of streaming from input to output ... I certainly like the idea of that but I felt it was above my Java skills atm, especially because I need to compress the stream in-between as well. – ken Mar 15 '18 at 18:11
  • I have tested this code and it appears to be working. I will mark this as correct, but if you had any examples you could point me toward in the more elegant camp I'd be very appreciative. Thanks either way. – ken Mar 15 '18 at 18:20
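Following up on that last comment, a sketch of the streaming approach the answer alludes to: copy the input in fixed-size chunks through a compressing output stream, so the whole file never lives in memory as one String. This assumes GZIP is an acceptable stand-in for the `compress` step (which isn't shown in the question), and the filename is a placeholder:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.zip.GZIPOutputStream;

public class StreamCompress {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (InputStream in = Files.newInputStream(Paths.get("output.xml"));
             GZIPOutputStream gzip = new GZIPOutputStream(compressed)) {
            byte[] buffer = new byte[8192];
            int n;
            // Copy in fixed-size chunks; only one 8 KB buffer is live at a time.
            while ((n = in.read(buffer)) != -1) {
                gzip.write(buffer, 0, n);
            }
        }
        System.out.println("Compressed size: " + compressed.size() + " bytes");
    }
}
```

Note this skips the `cleanXML` step, which would need to be stream-aware (or applied before writing the file) for this to be a drop-in replacement.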