
I'm working with some databases and I need to dump each database to a file. My server has 5 GB of RAM, and I assigned 4 GB just to the Tomcat server.

I made a simple result set that selects all the rows of each database; then I print each row to a file (.dat). (Obviously I do that separately for each database.)

I can work with an entire result set that has 5,738,095 rows.

But if I want to work with a database with more than 7 million rows, I get the error "Java heap space". I started with 3 GB on the server and 2 GB for Tomcat, but once I began working with more than 3 million rows I had to add more RAM to my server. So my question is: is it good to keep adding RAM to my server, or how can I split the result set and print it a million rows at a time without getting the "Java heap space" error?

I've been thinking of counting the rows in the entire table and then paging with LIMIT and OFFSET, something like the queries sketched below, but honestly I'm lost. Thanks for reading and helping, and sorry for my English.
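Roughly what I had in mind (I'm not sure this is the right approach):

    select * from table limit 1000000 offset 0;        -- first million rows
    select * from table limit 1000000 offset 1000000;  -- next million
    select * from table limit 1000000 offset 2000000;  -- and so on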

Here is the code I have so far:

    // numeroColumnas is the total number of columns in the table
    rs_datos = StDatos.executeQuery("select * from table");
    while (rs_datos.next()) {
        // write the current row as pipe-separated values
        for (int i = 1; i <= numeroColumnas; i++) {
            pw.print(rs_datos.getString(i));
            if (i < numeroColumnas) {
                pw.print("|"); // column separator
            }
        }
        pw.println(); // end of row
    }
    pw.close();
zickno
    Shouldn't matter how much (or little) RAM you have. A properly written loop that simply funnels some DB results into a file should **NOT** be sucking up 4 GB of RAM, unless you're dealing with some huge blob records. – Marc B Jan 22 '15 at 18:12
    Which database are you using? –  Jan 22 '15 at 18:13
  • MySQL / JSP + JDBC. – zickno Jan 22 '15 at 18:22

5 Answers


You do not have to read all the rows into RAM to print them. Connect to the DB, execute your SELECT, and print row by row while iterating over the result set. A million rows is nothing, believe me.
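A minimal sketch of that idea, assuming the MySQL Connector/J driver (which treats setFetchSize(Integer.MIN_VALUE) on a forward-only, read-only statement as a request to stream rows one at a time); the URL, credentials, table, and file names are placeholders:

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class TableExport {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "pass");
                 Statement st = conn.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
                 PrintWriter pw = new PrintWriter(
                     new BufferedWriter(new FileWriter("table.dat")))) {

                // MySQL-specific hint: fetch rows one by one instead of
                // buffering the whole result set in memory
                st.setFetchSize(Integer.MIN_VALUE);

                try (ResultSet rs = st.executeQuery("select * from table")) {
                    int cols = rs.getMetaData().getColumnCount();
                    while (rs.next()) {
                        for (int i = 1; i <= cols; i++) {
                            pw.print(rs.getString(i));
                            if (i < cols) pw.print('|');
                        }
                        pw.println();
                    }
                }
            }
        }
    }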

You can obviously use paging as well, but in your case you probably do not even need it.

And one last point: I really do not understand why you are implementing this kind of DB export yourself. All databases have such utilities ready to use, for example mysqldump for MySQL. Just find the utility that works with your DB and compose the correct command-line parameters.

AlexR
  • I did the select; then, when I had the entire result set, I printed it into the file. I don't know how to "properly" avoid loading the whole result set into RAM, but thanks, I will search for other utilities to do my work. – zickno Jan 22 '15 at 18:33
  • @zickno, you have not shown your code, so it is hard to help you. Show your code fragments and we will probably be able to give you better advice. – AlexR Jan 22 '15 at 18:34

In your current setup, check how many rows are actually loaded from the DB when you execute the SQL query. See:

Statement.setFetchSize

You should try to utilize lazy loading, if your JDBC driver supports it.

See also:

Java JDBC Lazy-Loaded ResultSet

In other words, what you intended to do with LIMIT and OFFSET, the JDBC driver can already do for you.
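With the asker's MySQL/JDBC setup, this would presumably look something like the following sketch (it assumes an already-open Connection conn; Connector/J treats Integer.MIN_VALUE as a request to stream rows one at a time):

    // Forward-only, read-only statement so the driver can stream
    Statement st = conn.createStatement(
            ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
    // MySQL Connector/J: fetch rows one by one instead of all at once
    st.setFetchSize(Integer.MIN_VALUE);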

peter.petrov

If what you need is to back up that specific table, just run mysqldump from the command line:

mysqldump -u... -p... mydb mytable > my_backup.sql

Most likely your program is not efficient, so a simple workaround would be to use this method.

Multitut

You can execute a command from the application or from cron and use it to create a document in CSV format. See: Mysqldump in CSV format
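For example, a rough sketch (mysqldump's --tab option makes the MySQL server itself write one delimited .txt data file per table into the given directory, so the server needs write access there; the user, directory, database, and table names are placeholders):

    mysqldump -u username -p --tab=/tmp/export \
        --fields-terminated-by=',' mydb mytable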

hfloresv

Gimme points, friend.

mysqldump -u username -p db_name table1_name > dump.sql

MarsAtomic