
Hi,

What am I trying to do?

I am currently working on an ESB project (Apache Camel + Spring Boot 2) where I read a MySQL table with more than 100,000,000 rows. I empty this table one row at a time, transform each row, and send it to another database.

How am I doing this?

Currently I use camel-sql to read the data:

// edited to add the splitter logic
from("sql:SELECT * FROM mytable?outputType=StreamList&outputClass=MyClass")
    .split(body()).streaming()
        .bean(mybean, "transform")
    .end();

Problem:

Since I can't do a SELECT * and hold all 100M rows in RAM (it is nowhere near big enough), I thought about using streams.

However, it seems that using StreamList as the outputType still fetches all the rows first and only then returns them as a "stream" (an already-materialised ResultSet).

Question

Can't we just use PreparedStatement's fetch-size property to really stream data from my database one row at a time, rather than fetching everything at once and blowing up my JVM's memory?
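
For reference, this is the kind of JDBC-level behaviour I have in mind (a rough plain-JDBC sketch, not Camel; imports omitted, and dataSource is just an assumed injected javax.sql.DataSource, with mytable and MyClass as above). With MySQL Connector/J, a forward-only, read-only statement with fetchSize = Integer.MIN_VALUE makes the driver stream the result set row by row instead of buffering it all client-side:

// Plain-JDBC sketch of row-at-a-time streaming with MySQL Connector/J.
// "dataSource" is assumed to be an injected javax.sql.DataSource.
try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(
             "SELECT * FROM mytable",
             ResultSet.TYPE_FORWARD_ONLY,
             ResultSet.CONCUR_READ_ONLY)) {
    ps.setFetchSize(Integer.MIN_VALUE); // MySQL-specific hint: stream, don't buffer
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            // map the current row to MyClass and hand it to the transform bean
        }
    }
}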

Thanks.

  • That does sound wrong. Skimming the documentation, do you need to use a [splitter](http://camel.apache.org/splitter.html) too to get it streamed one row at a time, or would it still load all the data first? But if you can't solve this then it's probably worth raising as a bug against Camel. – Rup Sep 06 '18 at 09:35
  • I am using a splitter. I just edited the code part to add my splitter logic. – Benjamin Coquard Sep 06 '18 at 09:45
  • I think the documentation is a little misleading - without polling or a recursive, limit-based select statement, there's no way for Camel to "stream" rows directly from the database. The way it is now, all the results will be loaded from the db and then individual rows will be released by the splitter. As far as I know, Spring's JdbcTemplate doesn't stream (and that is what Camel's SQL component is based on). Use limits with your query and iteratively retrieve the results. That, or poll. – kolossus Sep 07 '18 at 03:48
  • Hi, thanks for the answers. That's what I am going to do: LIMIT-based statements (a rough sketch is below this comment thread). But I am a little sad that we can't stream a database result. Thanks :). – Benjamin Coquard Sep 13 '18 at 13:32
  • Hi, can you share your limit-based query? – Kaif Khan Mar 23 '22 at 09:40
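
For anyone after the LIMIT-based query mentioned in the comments, here is a rough, illustrative sketch (not the author's actual code). It assumes an auto-increment id column on mytable, a Spring JdbcTemplate, and a hypothetical transformAndSend helper standing in for the transform bean plus the write to the target database. Keyset pagination (WHERE id > lastId) keeps each batch cheap instead of using an ever-slower OFFSET:

// Illustrative keyset-pagination loop; memory stays bounded by the batch size.
long lastId = 0L;
while (true) {
    List<MyClass> batch = jdbcTemplate.query(
            "SELECT * FROM mytable WHERE id > ? ORDER BY id LIMIT 1000",
            new BeanPropertyRowMapper<>(MyClass.class),
            lastId);
    if (batch.isEmpty()) {
        break; // table drained
    }
    for (MyClass row : batch) {
        transformAndSend(row); // hypothetical: transform + write to the other database
    }
    lastId = batch.get(batch.size() - 1).getId(); // assumes MyClass exposes getId()
}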

0 Answers