
I am trying to load some test data into a table which has a column called mytext of type TEXT. I have a plain-text novel file which is 3 MB in size. I am wondering if there is an easy and proper way to load the text line by line into the column mytext (one line per row), instead of looping over it in a for loop in my client language.

Basically:

insert into table (column) values('a line of the text file')

Edited

(1) My question was marked as a duplicate of a question about loading a CSV-formatted file, so I need to edit it. I don't think it's a duplicate because I cannot use the solution mentioned in that post, and my file is not CSV-formatted. It's simply a novel, and I only need to load data into one column.

(2) I just want to load a novel's text into a column so that I can test some LIKE and FTS features.

sgon00
  • Possible duplicate of [How to import CSV file data into a PostgreSQL table?](https://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgresql-table) – Tim Biegeleisen Jun 23 '19 at 02:56
  • @TimBiegeleisen hi, thanks a lot for the reply. I have edited the question; you can have a look. I cannot use the solution in that post, and my file is not a CSV file either. Thanks. – sgon00 Jun 23 '19 at 03:16
  • The "possible duplicate" Tim mentions does talk about CSV files, but the same COPY command also processes TEXT files. See Postgres documentation at https://www.postgresql.org/docs/current/sql-copy.html – Belayer Jun 23 '19 at 04:20
  • @Belayer first, thanks for the comment. But I read that page and failed to find a way to achieve what I want. I did try the COPY and \COPY commands, but they failed to load the data line by line into one column, i.e. one line per row. – sgon00 Jun 23 '19 at 05:45

2 Answers


(Make a copy of the file first.)

Here is a possibility:

Load the file into vi

Issue the command to escape single quotes:

:1,$s/'/''/g

Issue the command to insert the INSERT statement prefix at the start of each line:

:1,$s/^/insert into table values('/

Issue the command to append the trailing quote, parenthesis, and semicolon:

:1,$s/$/');/

Write the file

:w

Quit vi

:q

Now you have the SQL required. Run it.
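The three substitutions above can be sketched outside vi as well. Here is a shell demonstration on a single hypothetical sample line, using `sed` equivalents of the vi commands (the sample text is made up for illustration):

```shell
# A sample line containing a single quote that must be escaped for SQL
line="It's a dark and stormy night."

# 1. double single quotes, 2. prepend the INSERT prefix, 3. append the suffix
printf '%s\n' "$line" \
  | sed "s/'/''/g" \
  | sed "s/^/insert into table values('/" \
  | sed "s/$/');/"
# Output: insert into table values('It''s a dark and stormy night.');
```

Each vi `:1,$s/.../.../` command maps directly to one `sed "s/.../.../"` pass over the whole file.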

Ed Heal
  • Thanks a lot for the answer. It works. Btw, the `'` was missing from `('` and `')`. – sgon00 Jun 23 '19 at 06:32
  • I have a feeling that all of that could be done with `sed`, which is significantly faster than `vi(m)`. And the `copy` command is faster too (for complex and badly formatted data, `copy ... from program ...` could be used). PS: instead of `:w :q`, `:x` could be used. – Abelisto Jun 23 '19 at 10:05
  • @Abelisto yeah, `sed` can be used. I am not a `sed` user, so I may look up how to do the same in `sed` later. Btw, if you think `\copy` can be used in psql, can you provide a working command as a solution? I would be glad to see a `\copy` solution. So far, I cannot achieve what I want. Recall that I want to load each line as one row in a column: `insert into table (column) values('a line of the text file')`. – sgon00 Jun 25 '19 at 03:33
  • @Abelisto hi, I posted the `sed` version you suggested as an answer. Just FYI. – sgon00 Jun 25 '19 at 07:15

Based on the vim answer from Ed Heal, here is a sed version for macOS:

sed "s/'/''/g" novel.txt > tmp1.sql

sed "s/^/insert into table (column) values('/" tmp1.sql > tmp2.sql

sed "s/$/');/" tmp2.sql > tmp3.sql

rm tmp1.sql tmp2.sql && mv -i tmp3.sql final.sql

psql -U user dbname < final.sql &> /dev/null

To combine all the sed patterns into one command:

sed "s/'/''/g; s/^/insert into table (column) values('/; s/$/');/" novel.txt > final.sql
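As a usage sketch (the file names and sample lines are hypothetical), running the one-liner on a small sample file produces one INSERT statement per input line, with single quotes doubled:

```shell
# Create a two-line sample "novel"; the second line contains an apostrophe
printf "Call me Ishmael.\nIt's a truth universally acknowledged.\n" > novel.txt

# Escape quotes, add the prefix, and add the suffix in a single sed pass
sed "s/'/''/g; s/^/insert into table (column) values('/; s/$/');/" novel.txt > final.sql

cat final.sql
# insert into table (column) values('Call me Ishmael.');
# insert into table (column) values('It''s a truth universally acknowledged.');
```

Chaining the three `s///` expressions with `;` applies them left to right to each line, so the quote-escaping runs before the prefix and suffix introduce their own single quotes.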
sgon00