
I have my database hosted on Heroku, and I want to download specific parts of the database (e.g. all the rows with id > x from table 1, all the rows with name = x from table 2, etc.) in a single file.

From some research and asking a question here, it seems that some kind of modified pg_dump would solve my problem. However, I won't be able to use pg_dump because I won't have access to the command line (basically I want to be able to click a button in my web app that generates and downloads the database file).

So my new strategy is to use the Postgres COPY command. I'll go through the various tables in my server database and run `COPY (SELECT * FROM ... WHERE ...) TO filename`, where filename is just a temporary file that I will download when complete.

The issue is that this filename file will contain only the rows, so I can't just turn around and import it into pgAdmin. Assuming I have an 'empty' database set up (the schema, indices, and so on are already in place), is there a way I can format my filename file so that it can be easily imported into a Postgres DB?

  • I think you'll need to create an empty DB schema using your app, since it sounds like you're using schema generation. Then do row by row imports. – Craig Ringer May 27 '15 at 06:21
  • you can "import" data from that "filename" by running COPY TABLENAME from 'filename'... keep in mind while "exporting" that if you use one "filename" it will be overwritten by EACH copy statement – Vao Tsun May 27 '15 at 09:21
  • Another hiccup is that `filename` is resolved *on the server*; you will not have access to that filesystem. PostgreSQL does support copy `to stdout`; some drivers (e.g., Python's `psycopg2`) give you access to that. For example: `stmt.copy_to(sys.stdout, "(select * from my_table)")` or `stmt.copy_expert("copy (select * from my_table) to stdout csv header", sys.stdout)` – bimsapi May 27 '15 at 14:08
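Expanding on the last comment into something closer to runnable code: below is a minimal sketch, assuming psycopg2 (the table name, filter, and `DATABASE_URL` usage are placeholders, not from the question). `COPY ... TO STDOUT` streams rows back to the client, so nothing touches the server's filesystem:

```python
def build_copy_sql(query):
    # Wrap an arbitrary SELECT so the server streams the result to the client.
    return f"COPY ({query}) TO STDOUT CSV HEADER"

def export_query(conn, query, out_stream):
    # conn is a psycopg2 connection; copy_expert writes the rows to out_stream,
    # so no access to the server-side filesystem is required.
    with conn.cursor() as cur:
        cur.copy_expert(build_copy_sql(query), out_stream)

# Usage (needs a live connection, e.g. from your Heroku DATABASE_URL):
# import os, sys, psycopg2
# conn = psycopg2.connect(os.environ["DATABASE_URL"])
# export_query(conn, "SELECT * FROM table1 WHERE id > 42", sys.stdout)
```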

1 Answer


Building on my comment about to/from stdout/stdin, and answering the actual question about including multiple tables in one file: you can construct the output file so that it interleaves `copy ... from stdin` statements with the actual data, then load it via psql. For example, psql will accept input files that look like this:

copy my_table (col1, col2, col3) from stdin;
foo    bar    baz
fizz   buzz   bizz
\.

(Note the trailing `\.`, and that the default separator is a tab; you could also specify the delimiter option in the copy command.)

psql treats everything between the terminating `;` and the `\.` as stdin. This essentially emulates what pg_dump produces when you export table data without the schema (e.g., pg_dump -a -t my_table).

The resulting load could be as simple as psql mydb < output.dump.
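The interleaved format above can be generated directly in application code. Here is a minimal sketch (the table names, columns, and rows are hypothetical; in practice the rows would come from your filtered queries) that concatenates one `copy ... from stdin` block per table into a single psql-loadable file:

```python
def escape_field(value):
    # COPY text format: NULL is \N; backslash, tab, and newline must be escaped.
    if value is None:
        return r"\N"
    return (str(value)
            .replace("\\", "\\\\")
            .replace("\t", "\\t")
            .replace("\n", "\\n"))

def copy_block(table, columns, rows):
    # One "copy ... from stdin;" section, terminated by \. on its own line.
    lines = [f"copy {table} ({', '.join(columns)}) from stdin;"]
    for row in rows:
        lines.append("\t".join(escape_field(v) for v in row))
    lines.append("\\.")
    return "\n".join(lines) + "\n"

# Hypothetical tables and rows; real data would come from SELECT ... WHERE ...
dump = (copy_block("table1", ["id", "name"], [(1, "foo"), (2, None)])
        + copy_block("table2", ["name", "score"], [("x", 9.5)]))

with open("output.dump", "w") as f:
    f.write(dump)
```

The resulting file can then be loaded with `psql mydb < output.dump`, as described below.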

bimsapi