
I am trying to create a Lambda function that will connect to Amazon Redshift:

var pg = require('pg');

// Redshift speaks the PostgreSQL protocol, so the pg driver is used here
var conString = 'postgresql://username:Password@JDBC-URL';
var client = new pg.Client(conString);

client.connect(function(err) {
    if (err) {
        console.log('could not connect to redshift', err);
    }
    client.end();
});

But I am getting this exception:

Unable to import module 'index': Error 
at Function.Module._resolveFilename (module.js:325:15) 
at Function.Module._load (module.js:276:25) 
at Module.require (module.js:353:17) 
at require (internal/module.js:12:17) 
at Object.<anonymous> (/var/task/index.js:1:72) 
at Module._compile (module.js:409:26) 
at Object.Module._extensions..js (module.js:416:10) 
at Module.load (module.js:343:32) 
at Function.Module._load (module.js:300:12) 
at Module.require (module.js:353:17)

Can someone please help me with this?

Thanks.

Asish

2 Answers


If your goal is to push data from AWS Lambda into Amazon Redshift, you could use the AWS Lambda Redshift Loader.

See: A Zero-Administration Amazon Redshift Database Loader

John Rotenstein

Another successful approach for loading data into Amazon Redshift from Lambda is via Kinesis Firehose[2], which internally stages the data in S3 and then loads it into Redshift with a COPY command — the recommended way to load data, rather than INSERT statements.[1]

Data flow: Lambda > Firehose (S3) > Redshift.

Further reading for those who use this approach (even though it is a troubleshooting guide, it can save a lot of time if read beforehand): https://stackoverflow.com/a/34221861/2406687

Footnotes:

[1] "A COPY command is the most efficient way to load a table. You can also add data to your tables using INSERT commands, though it is much less efficient than using COPY" on http://docs.aws.amazon.com/redshift/latest/dg/t_Loading_data.html.

[2] http://docs.aws.amazon.com/firehose/latest/dev/what-is-this-service.html
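To make the COPY-vs-INSERT point in footnote [1] concrete, here is a rough illustration of the kind of COPY statement that loads staged S3 data into Redshift; the table, bucket and IAM role names are placeholder assumptions:

```javascript
// Placeholder COPY statement: loads newline-delimited JSON from S3
// into a Redshift table in one bulk operation, instead of row-by-row
// INSERTs. Table, bucket and role ARN below are hypothetical.
var copySql =
    "COPY my_table " +
    "FROM 's3://my-bucket/firehose/' " +
    "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' " +
    "FORMAT AS JSON 'auto';";

// With the pg client from the question, it could be run as:
// client.query(copySql, function(err, result) { /* ... */ });
```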

Hitesh