We have a Linux box onto which a third-party tool drops files of about 0.5 MB each, and we have roughly 32,000 such files. We need to process those files and insert the data into an Oracle 10g database. Someone in our organization has already written a Java program for this; it runs as a daemon thread and uses static fields to map the data from the file, save it to the DB, and then clear the static fields for the next line.
This processes the files serially and seems very slow. I'm planning either to make it multithreaded by getting rid of the static fields, or to run multiple Java processes in parallel (the same jar, each started with java -jar run.jar). But I'm concerned about issues like data locking.
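To make the multithreaded option concrete, here is roughly what I have in mind: a fixed thread pool where each worker handles whole files and keeps its own state instead of statics. The directory path and the processFile body are placeholders, not our real code:

```java
import java.nio.file.*;
import java.util.concurrent.*;
import java.util.stream.Stream;

public class ParallelLoader {
    public static void main(String[] args) throws Exception {
        // One worker per core; each worker would get its own DB connection,
        // so no static fields are shared between threads.
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());

        try (Stream<Path> files = Files.list(Paths.get("/data/incoming"))) {
            files.forEach(f -> pool.submit(() -> processFile(f)));
        }

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    static void processFile(Path file) {
        // parse the file and batch-insert its rows (see the sketches below)
    }
}
```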
The question is: what is the best way to bulk load this data into the DB using Java? Or is there a better way altogether?
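By bulk loading I mean something like JDBC batching, sketched below. TABLE_A and its columns are made-up names; our real schema has about 15 tables:

```java
import java.sql.*;
import java.util.List;

public class BatchInsert {
    // Inserts many rows with one round trip per batch instead of one per row.
    static void insertRows(Connection con, List<Object[]> rows) throws SQLException {
        String sql = "INSERT INTO TABLE_A (ID, PROP_NAME, PROP_VALUE) VALUES (?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (Object[] row : rows) {
                ps.setInt(1, (Integer) row[0]);
                ps.setString(2, (String) row[1]);
                ps.setInt(3, (Integer) row[2]);
                ps.addBatch();
            }
            ps.executeBatch();   // send the whole batch at once
        }
    }
}
```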
Update: the data we work on is in the following format; we process lines like the ones below to make entries into the DB (a rough parsing sketch follows the samples).
x.y.1.a.2.c.3.b = 12  // ID 1 of table A, one-to-many to table C sequence ID 3, and its property b = 12
x.y.1.a.2.c.3.f = 143 // ID 1 of table A, one-to-many to table C sequence ID 3, and its property f = 143
x.y.2.a.1.c.1.d = 12
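For illustration, here is a rough parse of one such line. The token positions are my reading of the comments above, not a spec:

```java
public class LineParser {
    public static void main(String[] args) {
        parse("x.y.1.a.2.c.3.b = 12");
    }

    // Splits "x.y.1.a.2.c.3.b = 12" into a dotted key path and a value.
    static void parse(String line) {
        String[] kv = line.split("=", 2);
        String[] path = kv[0].trim().split("\\.");
        int value = Integer.parseInt(kv[1].trim());

        // Per the comments above: path[2] is the Table A ID,
        // path[6] is the Table C sequence ID, path[7] is the property name.
        int tableAId = Integer.parseInt(path[2]);
        int tableCSeq = Integer.parseInt(path[6]);
        String property = path[7];

        System.out.printf("A.id=%d C.seq=%d %s=%d%n",
                tableAId, tableCSeq, property, value);
    }
}
```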
Update: We have about 15 tables that take this data. The data arrives in blocks; each block contains related data, and a block's related data is processed together. So inserting one block looks like the following (a transaction sketch follows the table):
Table 1 | Table 2 | Table 3 | ...
---------------------------------
5 rows  | 8 rows  | 12 rows | ...
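Since a block's rows are related, I assume each block should be committed atomically. Here is a minimal sketch of what I mean, with placeholder table names and two-column rows:

```java
import java.sql.*;
import java.util.List;

public class BlockWriter {
    // Writes one block of related rows across several tables in a single
    // transaction, so a failure never leaves a half-inserted block.
    static void writeBlock(Connection con,
                           List<Object[]> t1Rows,   // e.g. 5 rows
                           List<Object[]> t2Rows,   // e.g. 8 rows
                           List<Object[]> t3Rows)   // e.g. 12 rows
            throws SQLException {
        con.setAutoCommit(false);
        try {
            batchInsert(con, "INSERT INTO TABLE_1 (ID, VAL) VALUES (?, ?)", t1Rows);
            batchInsert(con, "INSERT INTO TABLE_2 (ID, VAL) VALUES (?, ?)", t2Rows);
            batchInsert(con, "INSERT INTO TABLE_3 (ID, VAL) VALUES (?, ?)", t3Rows);
            con.commit();
        } catch (SQLException e) {
            con.rollback();   // discard the partial block
            throw e;
        }
    }

    static void batchInsert(Connection con, String sql, List<Object[]> rows)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (Object[] r : rows) {
                ps.setObject(1, r[0]);
                ps.setObject(2, r[1]);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```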