I am trying to insert a large number of records from a CSV file into a table in an Oracle DB, in a Linux environment. The insert is simple: the CSV has IDs that need to be inserted into a single-column table for further processing.
This was my line of thought:
1. Read the file into an array.
2. Loop through each item in the array and insert it into the table.
I cannot use SQL*Loader or DBI; access to the DB server is restricted and I am not allowed to install any extra packages on the host that I use to connect to the DB. So I need to use the simplest tools available.
Here is the piece that works:
#!/usr/bin/perl
use strict;
use warnings;

my $inputfile = $ARGV[0];

# Read the file into an array
open(my $fh, "<", $inputfile) or die "Unable to open file $inputfile: $!";
my @idlist = <$fh>;
close($fh);

# Insert the array into the table, one sqlplus session per row
my $returnVar;
foreach my $id (@idlist)
{
    chomp($id);
    $returnVar = `sqlplus -s cvo/cvo123\@CVOSTAGE <<END
set heading off
set echo off
set feedback off
set pagesize 0
insert into tmp_tx_id values('$id');
exit
END`;
}
This works, but it opens and closes a database connection for every single insert. I need to load a large number of records (more than a million), and I agree this is a horrible way to do it. I tried to modify it in the following way, but was not successful:
.
.
<File has been read into the array>
$returnVar1 = `sqlplus -s cvo/cvo123\@CVOSTAGE\ <<END
set heading off
set echo off
set feedback off
set pagesize 0`;
foreach my $id(@idlist)
{
chomp($id);
$returnVar2 = `insert into tmp_tx_id values(\'$id\');`;
}
system("exit");
Apparently, the insert statement is never executed at the SQL*Plus prompt - the backticks hand it to the command shell instead. This is the error I get:
sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `insert into tmp_tx_id values('664436565');'
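From what I can tell, each set of backticks spawns a fresh shell, so the later insert statements never reach the sqlplus session opened earlier. One direction I have been considering (a sketch only, untested against my real data; the connect string cvo/cvo123@CVOSTAGE and the table tmp_tx_id are from my setup above, and the sample CSV here just stands in for the real input file) is to write all the statements into one script file and run sqlplus a single time:

```shell
#!/bin/sh
# Sketch: build one SQL script from the CSV, then open a single sqlplus session.
printf '664436565\n664436566\n' > ids.csv    # stand-in for the real CSV input

sqlfile=inserts.sql
{
  echo "set heading off"
  echo "set echo off"
  echo "set feedback off"
  echo "set pagesize 0"
  # One insert statement per CSV line
  while IFS= read -r id; do
    echo "insert into tmp_tx_id values('$id');"
  done < ids.csv
  echo "commit;"
  echo "exit"
} > "$sqlfile"

# All rows go through one connection (skipped here if sqlplus is not on PATH):
if command -v sqlplus >/dev/null 2>&1; then
  sqlplus -s cvo/cvo123@CVOSTAGE @"$sqlfile"
fi
```

With the statements in one file, sqlplus logs in only once and a single commit at the end covers the whole batch; SQL*Plus can also read the script from standard input (`sqlplus -s user/pass@db < inserts.sql`) if a file on disk is not wanted. I am not sure this is the right approach, though.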
Is there any way I can execute multiple insert statements without having to open and close the connection each time? Any other suggestions (even outside Perl) are welcome - I am a programmer still learning.
Thanks