I am trying to read data from a Snowflake ResultSet and construct each output line by appending all the columns as per my requirement. There are 1300 columns to read from the ResultSet. I wrote the approach below.
int available = -1;
StringBuilder row = new StringBuilder();
while (rs.next()) {
    // Append all 1300 columns of this record, tab-separated; nulls become empty strings
    for (int i = 1; i <= 1300; i++) {
        String value = rs.getString(i);
        if (value == null)
            value = "";
        row.append(value).append("\t");
    }
    // Trim the trailing tab, terminate the record with a newline, and write it out
    byte[] line = (row.toString().trim() + "\n").getBytes(StandardCharsets.UTF_8);
    available = line.length;
    bufferedOutputStream.write(line, 0, available);
    row.setLength(0); // reuse the builder for the next record
}
bufferedOutputStream.flush();
bufferedOutputStream.close();
This works up to 100K records, but when I tried to write 1 million records it failed, throwing:
java.lang.OutOfMemoryError: null
at java.base/java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:125)
at java.base/java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:119)
at java.base/java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:95)
at java.base/java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:156)
at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:81)
at java.base/java.io.BufferedOutputStream.write(BufferedOutputStream.java:127)
Please correct me if I am doing something wrong above. Basically, I need to write 10 million records without running into memory issues. Any help is appreciated; thanks in advance.
Edit: I am writing into a ByteArrayOutputStream so that the data can later be read back through a ByteArrayInputStream as an InputStream. A validation process then reads the data from that input stream.
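For context, this is roughly how the streams are wired up around the loop above (a minimal sketch; validate is a placeholder for my actual validation call, and the variable names match the snippet):

import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

// Everything accumulates in memory, which is where the heap runs out
// (see ByteArrayOutputStream.grow/hugeCapacity in the stack trace above)
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(byteArrayOutputStream);

// ... the while (rs.next()) loop above writes records here ...

// Re-read the accumulated bytes as an InputStream for validation
InputStream inputStream = new ByteArrayInputStream(byteArrayOutputStream.toByteArray());
validate(inputStream); // placeholder for the validation process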