
I am creating a process using the Java Runtime API on Solaris. I then get the InputStream from the process and read from it. I expect the process's output to be huge (I am not too sure about the process, it is a 3rd-party program), but it seems to be clipped. Could it be that there is a threshold on the Java side as to how much a process can have in its output stream?

Thanks, Abdul

Abdul Rahman
  • Yes, there should be no problem on the Java side. Try starting the process from the command line and comparing that output to the one you receive in Java. – rolve Oct 24 '12 at 17:49

3 Answers


There is no limit to the amount of data you can read, if you read repeatedly. You cannot read more than 2 GB at once, and some stream types might only give you a few KB at a time; e.g. a slow Socket will often give you 1.5 KB or less (based on the MTU of the connection).

If you call int read(byte[]), it is only guaranteed to read 1 byte. It is a common mistake to assume you will read the full buffer every time. If you need that, you can use DataInputStream.readFully(byte[]).
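A minimal sketch of such a read loop, assuming the process is started with Runtime.exec (the command shown is just a placeholder):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadProcessOutput {
    public static void main(String[] args) throws IOException {
        // Placeholder command for illustration only.
        Process process = Runtime.getRuntime().exec(new String[] {"ls", "-lR", "/tmp"});

        InputStream in = process.getInputStream();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        int n;
        // Keep reading until the stream reports end-of-stream (-1).
        // A single read(byte[]) may return far fewer bytes than the
        // buffer size, so looping is essential to get all the output.
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        System.out.println("Read " + out.size() + " bytes in total");
    }
}
```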

Peter Lawrey
  • Is there any limitation on file size for the DataInputStream.readFully function? – Aavik Nov 27 '17 at 14:54
  • @Aavik The file needs to have at least `length` bytes of the byte[] left to read; otherwise it will throw an IOException. Other than that, it can be any size. – Peter Lawrey Nov 27 '17 at 21:43

You shouldn't run into limitations on InputStream or OutputStream if they are properly implemented. The most likely resource limit you will hit is memory, when allocating objects either from the input or for the output; for example, trying to read a 100 GB file into memory in order to write it to an output. If you need to load very large objects into memory to or from a stream, make sure you use a 64-bit JVM and allocate as much memory to it as you can; however, testing is the only way to determine the ideal values.
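One way to sanity-check how much heap the JVM actually got (for instance after raising -Xmx) is Runtime.maxMemory(); a small sketch:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Reports the maximum heap the JVM will attempt to use,
        // e.g. as configured with -Xmx on the command line.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```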

doublesharp
  • You are correct, I tried to update my answer to make it more clear. I was intending to say that if you are allocating very large objects to write to an `OutputStream`, you may run into memory issues, not an actual limit on the stream. – doublesharp Oct 24 '12 at 18:08

By "process output stream" do you mean STDOUT? STDERR? Or you have an OutputStream object that you direct to somewhere? (a file?)

If you write to a file, you might see clipped data if you don't close your output stream. As long as you go by the book (call outputStream.close() when you are done writing), you are good to go. Note that there are some underlying limitations, like storage space (obviously) or file-system limits (some file systems cap the maximum file size).
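A minimal sketch of the close-on-exit pattern, using try-with-resources (Java 7+) and a placeholder path:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class WriteAndClose {
    public static void main(String[] args) throws IOException {
        // try-with-resources guarantees close() (and the implied flush)
        // even if an exception is thrown mid-write, so no trailing data
        // is left stranded in the stream's internal buffer.
        try (OutputStream out = new FileOutputStream("/tmp/output.dat")) {
            out.write("some payload".getBytes());
        }
    }
}
```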

If you write to STDOUT/STDERR, as far as I know you are fine. Note again that if you write your output to a terminal, or through Eclipse (for example), it might have a buffer and therefore limit your output (but in that case it's most likely that the first part of the data would go missing, not the last part).
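If terminal/console buffering is a concern, one option (a sketch with placeholder command and path, Java 7+) is to send the child's output straight to a file via ProcessBuilder, bypassing the console entirely:

```java
import java.io.File;
import java.io.IOException;

public class RedirectToFile {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Placeholder command and path for illustration only.
        ProcessBuilder pb = new ProcessBuilder("ls", "-lR", "/tmp");
        pb.redirectErrorStream(true); // fold STDERR into STDOUT
        pb.redirectOutput(new File("/tmp/process-output.log"));
        Process p = pb.start();
        p.waitFor();
    }
}
```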

Zach Moshe