My requirement is to load a large Excel file (more than 160k rows of data and around 150 columns), placed on a file server, into an Oracle DB table. It comes with a couple of constraints:
The position of two columns (let's say 'EmpID' and 'AcctNum') can vary in the Excel file: 'EmpID' might appear in column 'A' in one file but in column 'E' in another. The data mapping therefore has to be dynamic.
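To illustrate the kind of dynamic mapping I have in mind, here is a rough Python sketch that reads the header row with openpyxl and looks the columns up by name instead of by fixed position (the file path and the set of column names are just placeholders):

```python
from openpyxl import load_workbook

# Columns whose position can shift between files (placeholder names).
COLUMNS_OF_INTEREST = {"EmpID", "AcctNum"}

def build_column_map(sheet):
    """Map header names to 0-based column indexes by scanning row 1."""
    header = next(sheet.iter_rows(min_row=1, max_row=1, values_only=True))
    col_map = {name: idx for idx, name in enumerate(header)
               if name in COLUMNS_OF_INTEREST}
    missing = COLUMNS_OF_INTEREST - col_map.keys()
    if missing:
        raise ValueError(f"header row is missing columns: {missing}")
    return col_map

# read_only mode keeps memory usage low for a 160k-row sheet.
wb = load_workbook("/data/incoming/emp_file.xlsx", read_only=True)
sheet = wb.active
col_map = build_column_map(sheet)

for row in sheet.iter_rows(min_row=2, values_only=True):
    emp_id = row[col_map["EmpID"]]
    acct_num = row[col_map["AcctNum"]]
    # ... collect values for the DB load here
```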
A file can arrive at any time of day, multiple files can arrive on the same day, and each file should be processed within an hour of its creation on the file server. So I have to set up some kind of batch job around Oracle (similar to running a dtsx package through SQL Server jobs).
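For the scheduling part, the approach I am considering is a small polling script run from cron every few minutes, which picks up any new file and moves it aside once loaded so it is never processed twice. A minimal sketch, assuming the files land in /data/incoming and that process_file() is the actual load step (both names are placeholders):

```python
import shutil
from pathlib import Path

INCOMING = Path("/data/incoming")    # placeholder directory
PROCESSED = Path("/data/processed")  # placeholder directory

def process_file(path: Path) -> None:
    """Placeholder for the actual Excel-to-Oracle load."""
    print(f"loading {path} ...")

def run_once() -> None:
    for xlsx in sorted(INCOMING.glob("*.xlsx")):
        process_file(xlsx)
        # Moving the file marks it as done, so a rerun never loads it twice.
        shutil.move(str(xlsx), str(PROCESSED / xlsx.name))

if __name__ == "__main__":
    run_once()
```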
The OS where the batch is to be run is Unix.
Performance is a key challenge here, so please treat it as an important criterion.
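For the Oracle load itself, I was thinking of batched inserts via executemany() rather than row-by-row inserts, to cut down on round trips. A sketch assuming python-oracledb (the open-source successor of cx_Oracle); the connection details, table and column names are placeholders:

```python
import oracledb

BATCH_SIZE = 5000

def load_rows(rows):
    """rows: iterable of (emp_id, acct_num) tuples built from the sheet."""
    with oracledb.connect(user="app", password="secret",
                          dsn="dbhost/ORCLPDB1") as conn:
        with conn.cursor() as cur:
            sql = "INSERT INTO emp_staging (emp_id, acct_num) VALUES (:1, :2)"
            batch = []
            for row in rows:
                batch.append(row)
                if len(batch) >= BATCH_SIZE:
                    # One round trip per batch instead of one per row.
                    cur.executemany(sql, batch)
                    batch.clear()
            if batch:
                cur.executemany(sql, batch)
        conn.commit()
```

An alternative I have read about is converting the sheet to CSV and loading it with SQL*Loader, but I am not sure how well that copes with the shifting column positions.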
Please advise on how we can do this, preferably using freeware/open-source tools.
Thanks & regards, Arka