I have searched around Stack Overflow for related problems, but I did not find any.

I have a SQL table in this format (call this file create_table.sql):

CREATE TABLE object (
id                BIGSERIAL PRIMARY KEY,
name_c            VARCHAR(10) NOT NULL,
create_timestamp  TIMESTAMP   NOT NULL,
change_timestamp  TIMESTAMP   NOT NULL,
full_id           VARCHAR(10),
mod               VARCHAR(10) NOT NULL CONSTRAINT mod_enum CHECK (mod IN ('original', 'old', 'delete')),
status            VARCHAR(10) NOT NULL CONSTRAINT status_enum CHECK (status IN ('temp', 'good', 'bad')),
vers              VARCHAR(10) NOT NULL REFERENCES vers (full_id),
frame_id          BIGINT      NOT NULL REFERENCES frame (id),
name              VARCHAR(10),
definition        VARCHAR(10),
order_ref         BIGINT REFERENCES order_ref (id),
UNIQUE (id, name_c)
);

This table is stored in Google Cloud. I have about 200,000 rows to insert, for which I use an "insert block" method. It looks like this (call this file object_name.sql):

INSERT INTO object(
name_c,
create_timestamp,
change_timestamp,
full_id,
mod,
status,
vers,
frame_id,
definition)
VALUES
('Element', current_timestamp, current_timestamp, 'Element:1', 'original', 'temp', 'V1', (SELECT id FROM frame WHERE frame_id='Frame:data'), 'Description to element 1'),
('Element', current_timestamp, current_timestamp, 'Element:2', 'original', 'temp', 'V1', (SELECT id FROM frame WHERE frame_id='Frame:data'), 'Description to element 2'),
...
...
('Element', current_timestamp, current_timestamp, 'Element:200000', 'original', 'temp', 'V1', (SELECT id FROM frame WHERE frame_id='Frame:data'), 'Description to element 200000');

I have a bash script where psql is used to upload the data in object_name.sql to the table in Google Cloud:

PGPASSWORD=password psql -d database --username username --port 1234 --host 11.111.111 << EOF
    BEGIN;
    \i object_name.sql
    COMMIT;
EOF

(Source: single transaction)
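
For completeness, I believe the same thing can be done with psql's own flags instead of the here-document (a variant I have not benchmarked; --single-transaction wraps the whole file in BEGIN/COMMIT, and ON_ERROR_STOP makes psql stop at the first error):

PGPASSWORD=password psql -d database --username username --port 1234 --host 11.111.111 \
    --single-transaction --set ON_ERROR_STOP=on -f object_name.sql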

When I run this, I get this error:

BEGIN
psql:object_name.sql:60002: SSL SYSCALL error: EOF detected
psql:object_name.sql:60002: connection to server was lost

The current "solution" I have done now, is to chunk the file so each file can only have max 10000 insert statements. Running the psql command on these files works, but they take around 7 min.

Instead of having one file with 200000 insert statements, I divided them into 12 files where each file had max 10000 insert statements.
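
The loading loop looks roughly like this (the object_name_part_*.sql names are just my own naming for the chunk files):

# Load each chunk file in its own transaction (chunk file names are my own naming)
for f in object_name_part_*.sql; do
    PGPASSWORD=password psql -d database --username username --port 1234 --host 11.111.111 \
        --single-transaction -f "$f"
done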

My questions:

1. Is there a limit on how large a file psql can load in one go?
2. I also saw this post about how to speed up inserts, but I could not get COPY to work (see my attempt below).
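
For reference, this is roughly what I attempted with COPY, via psql's \copy (the file object_name.csv is my own placeholder; I never got a working version). I suspect one complication is that COPY loads literal values only, so the frame_id subquery from my insert block cannot be used there; the CSV would need the resolved frame ids:

-- Hypothetical: object_name.csv must contain literal values, including resolved frame ids
\copy object(name_c, create_timestamp, change_timestamp, full_id, mod, status, vers, frame_id, definition) FROM 'object_name.csv' WITH (FORMAT csv)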

I hope someone out there has time to help me.

longhh
  • Do you have a [statement_timeout](https://www.postgresql.org/docs/current/static/runtime-config-client.html#guc-statement-timeout) defined? –  Oct 26 '17 at 14:34
  • No, I have not defined a statement_timeout – longhh Oct 27 '17 at 05:15
