
The code below can only insert 226 out of 20,000 numbers from a supplied file into a number column of TEMPTABLE, and then throws:

    java.sql.SQLException: ORA-00604: error occurred at recursive SQL level 1
    ORA-01000: maximum open cursors exceeded
    ORA-00604: error occurred at recursive SQL level 1
    ORA-01000: maximum open cursors exceeded

Here is the code:

    private void ButtonloadActionPerformed(java.awt.event.ActionEvent evt) {
        PreparedStatement pre2;
        // load the database driver and open a connection to it;
        // all statements are in a try/catch block
        try {
            String driverName = "oracle.jdbc.driver.OracleDriver";
            Class.forName(driverName);
            String url = "jdbc:oracle:thin:@" + serverName + ":" + serverport + ":" + sid;
            conn = DriverManager.getConnection(url, username, password);

            // upload the contents of the CSV file at `path` into TEMPTABLE in the DB
            try (BufferedReader br = new BufferedReader(new FileReader(path))) {
                String line;
                while ((line = br.readLine()) != null) {
                    String[] value = line.split(",");
                    String sql1 = "insert into TEMPTABLE(numbers) values('" + value[0] + "')";
                    pre2 = conn.prepareStatement(sql1);
                    pre2.executeUpdate();
                }
                br.close();   // closing the buffered reader
                conn.close(); // closing database connection to free system resources
            }
        }
        catch (ClassNotFoundException | SQLException | IOException e) {
            JOptionPane.showMessageDialog(null, e);
        }
    }

Please can anyone help me fix this?

SusanW
  • You're not closing your `PreparedStatement`, which keeps the cursors open. The code is extremely poor quality otherwise too, and I suspect even if it did work, the performance would be horrible. I recommend reading some up-to-date tutorials on using JDBC, and then looking into batching. – Kayaman Sep 20 '16 at 18:05
  • For a start, learn how to parameterise your prepared statement with bind variables, so you only have to prepare (parse) it once, and then set the bound value in the loop. Batching would still be much more efficient, but at least then you wouldn't be killing your cache with 20,000 separate insert statements, and having the overhead of 20,000 hard parses. – Alex Poole Sep 20 '16 at 18:12
  • @AlexPoole and kayaman You were adding your excellent suggestions as fast as I was typing an answer! Sorry! If either of you want to write an Answer, I'll upvote it (assuming it's any good :-) ). – SusanW Sep 20 '16 at 18:16
  • @kayaman and AlexPoole: Thank you for the observations. Following your suggestions, I would like to learn how to improve: can you help redraft this particular code into higher quality code, including batching and/or parameterising the prepared statement with bind variables? The code worked fine after including pre2.close(); and could upload 200,000 values into one column in under 50 seconds. Any helpful links for learning are welcome. Thanks in advance. – Alex lodeon Sep 21 '16 at 12:31
  • @Alexlodeon Have a look at [here](http://stackoverflow.com/questions/11389449/performance-of-mysql-insert-statements-in-java-batch-mode-prepared-statements-v) for some examples. It discusses MySQL, but it's the same for Oracle. Pay attention to how to set parameters to a `PreparedStatement` and how to execute inserts in batches. – Kayaman Sep 21 '16 at 12:39

2 Answers


You aren't closing your PreparedStatements, and they each use cursor resources on your database.

You could add a pre2.close() after the executeUpdate() - that would fix the immediate issue. But it will still be very slow and resource-intensive - as noted in the comments, you should look into batching and bind variables.
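To make that concrete, here is a sketch of the batched, bind-variable version. The TEMPTABLE(numbers) table comes from the question; the BATCH_SIZE value and the insertNumbers/firstColumn helper names are illustrative, not part of the original code:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CsvBatchInsert {
    static final int BATCH_SIZE = 1000; // flush periodically to bound memory use

    // Extracts the first comma-separated field of a CSV line.
    static String firstColumn(String line) {
        int comma = line.indexOf(',');
        return comma < 0 ? line : line.substring(0, comma);
    }

    // Inserts the first column of each CSV line into TEMPTABLE(numbers).
    // The statement is prepared (parsed) once, the bind variable is set
    // per row, and rows are sent to the database in batches.
    static void insertNumbers(Connection conn, Reader csv)
            throws SQLException, IOException {
        String sql = "insert into TEMPTABLE(numbers) values(?)";
        try (BufferedReader br = new BufferedReader(csv);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            String line;
            int pending = 0;
            while ((line = br.readLine()) != null) {
                ps.setString(1, firstColumn(line)); // use setInt for a numeric value
                ps.addBatch();
                if (++pending % BATCH_SIZE == 0) {
                    ps.executeBatch(); // flush a full batch
                }
            }
            ps.executeBatch();         // flush the remainder
        }                              // statement and reader auto-closed here
    }
}
```

The try-with-resources block guarantees the PreparedStatement is closed even when an exception is thrown, so cursors can no longer leak.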

SusanW
    try {
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String sql1 = "insert into TEMPTABLE(numbers) values(?)";
            pre2 = conn.prepareStatement(sql1);
            String line;
            while ((line = br.readLine()) != null) {
                String[] value = line.split(",");
                pre2.setString(1, value[0]); // if the column is numeric, use setInt
                pre2.addBatch();
            }
            pre2.executeBatch();
            pre2.close(); // closing the statement releases its cursor
            conn.close(); // closing database connection to free system resources
        }
    }
    catch (ClassNotFoundException | SQLException | IOException e) {
        JOptionPane.showMessageDialog(null, e);
    }

Try adding batching - this will speed up your performance as well.

Sahil Manchanda
  • Hi Sahil, thank you. "pre2.close(0;" I guess was a typographical error – Alex lodeon Sep 21 '16 at 12:09
  • "pre2.close(0);" comes with an error; the problem was solved with "pre2.close();". – Alex lodeon Sep 21 '16 at 12:51
  • I just noticed the batching; I will verify and give a thumbs up. Thank you. I would welcome a link from you about batching. High regards – Alex lodeon Sep 21 '16 at 12:58
  • Batching was efficient but returns an SQL exception ("data too large for column") after it has inserted up to 144,700 numbers, while the non-batching code was holding the system resources at ransom but inserted up to 307,000 numbers before I had to stop the program. I think the batching would be perfect without the exception; there is no need for extra overhead, since I am inserting into only one column and the CSV is a one-column file with up to 400,000 values. Also, pre2.close(0) does not work in the batching example... thanks – Alex lodeon Sep 21 '16 at 13:30