> Am I doing something wrong here?
Well, some things are wrong, and other things are far from ideal.
- You've got a variable of type `BufferedReader` called `fileReader`. That's confusing to say the least.
- You're using `FileReader`, which is generally a bad idea as it always uses the platform default encoding.
- You're only reading while `ready()` returns true. That just returns whether or not the next read will block - it may be okay for files, but it's definitely not a good idea in general. You should read until the next call indicates that you've exhausted the stream.
- You're reading a character at a time, which is somewhat inefficient - there's no need to make one call per character rather than using the overload of `read` which takes a character array, allowing bulk transfer.
- You're using string concatenation in a loop to build up the file contents, which is also very inefficient.
- There's no indication that you're closing the reader. Maybe that's in code which you haven't posted...
- You've got two levels of `try` block for no obvious reason, and your `IOException` handling is almost always the wrong approach - you should very rarely swallow exceptions (even after logging) and then continue as if nothing happened. (See the sketch after this list for one way to address several of these points.)
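If you really do need to write the reading code yourself, a minimal sketch of the manual approach - assuming you simply want the whole file as a `String`, and using UTF-8 as a stand-in for whatever encoding the file actually uses - might look something like this:

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class ReadWholeFile {
    // Reads the whole file into a String. UTF-8 is an assumption here -
    // substitute whatever encoding the file really uses.
    static String readAll(String path) throws IOException {
        StringBuilder text = new StringBuilder();
        // try-with-resources closes the reader even if an exception is thrown
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(new FileInputStream(path), StandardCharsets.UTF_8))) {
            char[] buffer = new char[8192];
            int charsRead;
            // read(char[]) returns -1 once the stream is exhausted -
            // keep reading until then, transferring a buffer at a time
            while ((charsRead = reader.read(buffer)) != -1) {
                text.append(buffer, 0, charsRead);
            }
        }
        return text.toString();
    }
}
```

That's still a fair amount of code for such a simple task, which is rather the point of the next suggestion.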
If you possibly can, avoid writing this code completely - use Guava instead:
// Guava's com.google.common.io.Files and com.google.common.base.Charsets
// Use the appropriate encoding for the file, of course.
String text = Files.toString(new File(args[2]), Charsets.UTF_8);
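(If adding a Guava dependency isn't an option, the JDK's own `java.nio.file` API gives you a similar one-liner - again a sketch assuming UTF-8:)

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadWithNio {
    public static void main(String[] args) throws IOException {
        // This is java.nio.file.Files, not Guava's Files
        String text = new String(Files.readAllBytes(Paths.get(args[2])),
                                 StandardCharsets.UTF_8);
        System.out.println(text);
    }
}
```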
Now of course, you may well find that you still see the same result - maybe the file has `"\r"` as the line break and you're on a system which only interprets that as "return to start of line", so each line is overwriting the previous one. We can't really tell that - but once you've replaced your code with a single call to `Files.toString()`, it'll be easier to diagnose.
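For example, printing the text with the line terminators made visible (a hypothetical one-off diagnostic, not something to keep in your code) will show you immediately which line breaks the file actually contains:

```java
// Show whether the file uses \r, \n or \r\n as its line terminator
System.out.println(text.replace("\r", "\\r").replace("\n", "\\n\n"));
```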