
I am using a Postgres COPY stream to insert records into Postgres. For a single column it works fine, but what is the ideal data format for COPY with multiple columns?

Code snippet:

    var sqlcopysyntax = 'COPY srt (starttime, endtime) FROM STDIN  delimiters E\'\\t\'';

    var stream = client.query(copyFrom(sqlcopysyntax));

    console.log(sqlcopysyntax);

    var interndataset = [
        ['1', '4'],
        ['6', '12.074'],
        ['13.138', '16.183'],
        ['17.226', '21.605'],
        ['22.606', '24.733'],
        ['24.816', '27.027'],
        ['31.657', '33.617'],
        ['34.66', '37.204'],
        ['37.287', '38.58'],
        ['39.456', '43.669'],
        ['43.752', '47.297'],
        ['47.381', '49.55']
    ];

    var started = false;
    var internmap = through2.obj(function(arr, enc, cb) {
        /* updated this part with the solution provided by @VaoTsun */
        var rowText = arr.map(function(item) { return (item.join('\t') + '\n') }).join('');
        //console.log(rowText)
        rowText = rowText + '\\\.';
        /* end here */
        started = true;

        cb(null, rowText);
    })

    internmap.write(interndataset);
    internmap.end();

    internmap.pipe(stream);

With that I first got an error (due to the delimiter): missing data for column "endtime" (resolved), but then got the error below:

error: end-of-copy marker corrupt
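A quick check (plain Node, no database needed) of what the `'\\\.'` appended above actually is: in a JavaScript string literal, `\\` is a single backslash and `\.` is just `.`, so the expression evaluates to the two characters `\.` glued onto the data with no terminating newline. Per the COPY text format, the end-of-data marker must be a backslash and a period on a line by itself, which is one reason the server can report the marker as corrupt; and when streaming through node-pg-copy-streams, the driver ends the copy itself, so the marker should not be written into the data at all.

```javascript
// '\\' is a backslash; '\.' is an unrecognized escape, i.e. just '.'
console.log('\\\.');        // prints: \.
console.log('\\\.'.length); // prints: 2
```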


COPY intern (starttime, endtime) FROM STDIN
1                       4
6                       12.074
13.138                  16.183
17.226                  21.605
22.606                  24.733
24.816                  27.027
31.657                  33.617
34.66                   37.204
37.287                  38.58
39.456                  43.669
43.752                  47.297
47.381                  49.55

Any pointer on how to resolve this? What would be the ideal format for multi-column inserts using the COPY command?
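For reference, the text format COPY expects for multiple columns is one line per row, columns separated by tabs, each line terminated by a newline. A minimal sketch (the helper name is my own) that renders the dataset above into that shape:

```javascript
// Render rows as COPY text-format payload: tab-separated columns,
// newline-terminated lines. No end-of-data marker is added here.
function toCopyText(rows) {
    return rows.map(function (row) { return row.join('\t'); }).join('\n') + '\n';
}

console.log(JSON.stringify(toCopyText([['1', '4'], ['6', '12.074']])));
// "1\t4\n6\t12.074\n"
```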

Rizwan Patel
  • just try using other delimiter then chr(9) ?.. – Vao Tsun Jul 31 '17 at 11:21
  • when copying from `stdin` you need `\.` on a line by itself to end the data. – Jasen Jul 31 '17 at 11:29
  • could you please have a look at it ! https://github.com/brianc/node-pg-copy-streams/issues/65#issuecomment-319035793 – Rizwan Patel Jul 31 '17 at 11:45
  • giving delimiters `var sqlcopysyntax = 'COPY srt (starttime, endtime) FROM STDIN with delimiter \'\\t\' null as ';` gives error "**syntax error at end of input**" – Rizwan Patel Jul 31 '17 at 12:00
  • @Jasen updated my question ! – Rizwan Patel Jul 31 '17 at 12:44
  • @VaoTsun updated my question – Rizwan Patel Jul 31 '17 at 12:44
  • try `var rowText = arr.map(function(item) { return (item.join('\t') + '\n') }).join('') + '\\.';` – Vao Tsun Jul 31 '17 at 13:05
  • getting 'end-of-copy marker corrupt' error – Rizwan Patel Jul 31 '17 at 13:13
  • [This approach is the most efficient one](https://stackoverflow.com/questions/37300997/multi-row-insert-with-pg-promise), as it gives you full control over the data being imported, along with very high performance. The `COPY` approach has the same performance, but it gives you no flexibility in controlling/converting the data. – vitaly-t Jul 31 '17 at 23:41
  • @vitaly-t ng-promise is great , but i need to bulk insert *stream* into postgres which i don't think is available in ng-promise – Rizwan Patel Aug 01 '17 at 08:26
  • @RizwanPatel it is available with `pg-promise` (not `ng-promise`) :) See [Data Imports](https://github.com/vitaly-t/pg-promise/wiki/Data-Imports). – vitaly-t Aug 01 '17 at 08:46
  • woah thanks ! @vitaly-t so basically under the hood *massive insert* streaming is happening i assume ? – Rizwan Patel Aug 01 '17 at 09:25
  • @RizwanPatel it works through an adapter, as shown in the example - `The key element here is a function that on each call will fetch the next set of data rows from its source, either in a sequential manner (like from a stream), or using a paging logic (from an index):` ;) So it is not under the hood, it is quite explicit, but yes, the data can come from the stream, in batches, i.e. you read, say 1000 records at a time and insert them all at once, and keep going till no data left. – vitaly-t Aug 01 '17 at 10:43
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/150712/discussion-between-rizwan-patel-and-vitaly-t). – Rizwan Patel Aug 01 '17 at 10:53
  • @vitaly-t require some pointers, suggestions on how to implement pg-promise *massive insert* in below code https://gist.github.com/matt212/9aceee8d314b2a533c3598806ed06d74 – Rizwan Patel Aug 01 '17 at 11:13
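As an aside on the delimiter clause attempted in the comments above: the quote nesting is easier to get right with double-quoted JavaScript strings. A sketch (note that a tab is already the default delimiter in text format, so the option can usually be dropped entirely):

```javascript
// Double-quoted JS string avoids escaping the single quotes around E'\t'.
var sqlcopysyntax = "COPY srt (starttime, endtime) FROM STDIN WITH (DELIMITER E'\\t')";
console.log(sqlcopysyntax);
// COPY srt (starttime, endtime) FROM STDIN WITH (DELIMITER E'\t')
```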

2 Answers


With immense help from @jeromew from the GitHub community, and a proper implementation of node-pg-copy-streams (which takes away the COPY command's complexity), we were able to solve this issue:

https://github.com/brianc/node-pg-copy-streams/issues/65

Below is the working code snippet:

    var sqlcopysyntax = 'COPY srt (starttime, endtime) FROM STDIN';

    var stream = client.query(copyFrom(sqlcopysyntax));

    console.log(sqlcopysyntax);

    var interndataset = [
        ['1', '4'],
        ['6', '12.074'],
        ['13.138', '16.183'],
        ['17.226', '21.605'],
        ['22.606', '24.733'],
        ['24.816', '27.027'],
        ['31.657', '33.617'],
        ['34.66', '37.204'],
        ['37.287', '38.58'],
        ['39.456', '43.669'],
        ['43.752', '47.297'],
        ['47.381', '49.55']
    ];

    var started = false;
    var internmap = through2.obj(function(arr, enc, cb) {
        var rowText = (started ? '\n' : '') + arr.join('\t');
        started = true;

        cb(null, rowText);
    });

    interndataset.forEach(function(r) {
        internmap.write(r);
    });

    internmap.end();

    internmap.pipe(stream);
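One caveat worth adding to this working version: if a value can itself contain the delimiter, a newline, or a backslash, it must be escaped before the rows are joined, per the COPY text format. A sketch (the function name is my own):

```javascript
// Escape one value for COPY text format; NULL becomes \N.
function escapeCopyValue(v) {
    if (v === null) return '\\N';
    return String(v)
        .replace(/\\/g, '\\\\')   // backslash first, before adding new ones
        .replace(/\t/g, '\\t')
        .replace(/\n/g, '\\n')
        .replace(/\r/g, '\\r');
}

console.log(escapeCopyValue('a\tb')); // a\tb  (literal backslash-t)
console.log(escapeCopyValue(null));   // \N
```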
Rizwan Patel

https://www.postgresql.org/docs/current/static/sql-copy.html

DELIMITER

Specifies the character that separates columns within each row (line) of the file. The default is a tab character in text format, a comma in CSV format. This must be a single one-byte character. This option is not allowed when using binary format.

Try using a non-default delimiter (as a tab character can get mangled on copy/paste), e.g.:

t=# create table intern(starttime float,endtime float);
CREATE TABLE
t=# \! cat 1
COPY intern(starttime,endtime) FROM STDIN delimiter ';';
1;4
6;12.074
13.138;16.183
17.226;21.605
22.606;24.733
24.816;27.027
31.657;33.617
34.66;37.204
37.287;38.58
39.456;43.669
43.752;47.297
47.381;49.55
49.633;54.68
54.763;58.225
59.142;62.98
64.189;68.861
69.82;71.613
72.364;76.201
76.285;78.787
78.871;81.832
\.

t=# \i 1
COPY 20
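The same semicolon-delimited payload can be produced on the Node side (a sketch, not tied to any driver; the trailing `\.` line is only needed when feeding psql's stdin, not node-pg-copy-streams):

```javascript
// Build a ';'-delimited COPY payload matching the psql session above.
var delimiter = ';';
var rows = [['1', '4'], ['13.138', '16.183']];
var payload = rows.map(function (r) { return r.join(delimiter); }).join('\n') + '\n\\.\n';
console.log(payload);
```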

Also, the data in your question lacks `\.` — try typing the COPY in psql and you will see the instructions:

t=# COPY intern(starttime,endtime) FROM STDIN delimiter ';';
Enter data to be copied followed by a newline.
End with a backslash and a period on a line by itself.

End with a backslash and a period on a line by itself.

Vao Tsun
  • could you please refer issue https://github.com/brianc/node-pg-copy-streams/issues/65#issuecomment-319035793 – Rizwan Patel Jul 31 '17 at 11:40
  • so you use node-pg-copy?.. but your syntax does not look like – Vao Tsun Jul 31 '17 at 11:45
  • you can view the whole issue here. what would be proper syntax any pointers ? https://github.com/brianc/node-pg-copy-streams/issues/65 – Rizwan Patel Jul 31 '17 at 11:49
  • for single insert it works as expected , but for multiple column how to give delimiters `var sqlcopysyntax = 'COPY srt (starttime, endtime) FROM STDIN with delimiter \\t\ '; ` gives error any suggestions – Rizwan Patel Jul 31 '17 at 11:58