I am trying to solve a producer/consumer problem that is I/O intensive. The producer constantly appends data to a file and the consumer reads from this growing file. The file is usually several GB (around 10 GB).
Initially I tried BufferedOutputStream and BufferedInputStream to write and read the file. During bursts of data, which arrive at 9:30 AM, this takes too much system CPU, around 30-40% (presumably from I/O system calls).
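For reference, my producer side looks roughly like this (the file name, buffer size, and record size here are placeholders for my real setup):

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class BufferedWriterDemo {
    public static void main(String[] args) throws IOException {
        // append mode so the consumer can tail the growing file;
        // "out.dat" and the 8 KB buffer are stand-ins for my real setup
        try (BufferedOutputStream out =
                new BufferedOutputStream(new FileOutputStream("out.dat", true), 8192)) {
            byte[] record = new byte[256]; // stand-in for one incoming message
            for (int i = 0; i < 1000; i++) {
                // each time the buffer fills, this triggers a write() syscall
                out.write(record);
            }
        } // close() flushes the remaining buffered bytes
    }
}
```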
I am now looking at memory-mapped files to make this faster:
import java.io.File;
import java.io.FileInputStream;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

File fileToRead = new File("C:\\readThisFile.dat");
FileChannel inChannel = new FileInputStream(fileToRead).getChannel();
// throws IllegalArgumentException because size exceeds Integer.MAX_VALUE
MappedByteBuffer buffer = inChannel.map(FileChannel.MapMode.READ_ONLY, 0, inChannel.size());
byte[] data = new byte[(int) inChannel.size()];
buffer.get(data);
1) Since readThisFile.dat is larger than Integer.MAX_VALUE bytes, inChannel.map() throws an IllegalArgumentException.
2) How can the consumer continuously read data from an extremely large file using memory-mapped files? Could it map, say, 100 MB at a time and keep polling for more data?
3) Is there a faster solution in Java than memory-mapped files?
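For question 2, something like this sliding-window approach is what I have in mind: map the file in fixed-size chunks instead of all at once, since a single map() region cannot exceed Integer.MAX_VALUE bytes. The 100 MB chunk size, the polling interval, and the process() hook are placeholders. Is this the right pattern?

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class ChunkedReader {
    static final long CHUNK_SIZE = 100L * 1024 * 1024; // 100 MB window (placeholder)

    /** Maps and consumes everything between 'position' and the current EOF,
     *  one window at a time; returns the new read position. */
    static long drain(FileChannel channel, long position) throws IOException {
        long fileSize = channel.size(); // file may have grown since the last call
        while (position < fileSize) {
            long mapSize = Math.min(CHUNK_SIZE, fileSize - position);
            MappedByteBuffer buffer =
                    channel.map(FileChannel.MapMode.READ_ONLY, position, mapSize);
            byte[] data = new byte[buffer.remaining()];
            buffer.get(data);
            process(data); // consumer's real processing logic goes here
            position += mapSize;
        }
        return position;
    }

    static void process(byte[] data) {
        // placeholder for the consumer's work
    }

    public static void main(String[] args) throws IOException {
        // demo with a small temp file standing in for the real growing file
        File demo = File.createTempFile("demo", ".dat");
        demo.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(demo)) {
            out.write(new byte[1024]);
        }
        try (RandomAccessFile raf = new RandomAccessFile(demo, "r");
             FileChannel channel = raf.getChannel()) {
            long position = drain(channel, 0);
            System.out.println("consumed " + position + " bytes");
            // the real consumer would loop: sleep briefly, then drain() again
        }
    }
}
```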