
I want to set up a dev environment of Hasura on my local machine that replicates my existing production (same tables, same schema, same data).

  • What are the required steps to achieve this task?
  • Use Docker? Copy the DB structure/data? For more details, read about backups in PostgreSQL. – xadm Feb 15 '20 at 13:16

3 Answers


I've found this process to work well.

  1. Create a clean empty local postgresql database and Hasura instance. To update an existing local database, drop it and recreate it.
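
     If you don't already run Postgres and Hasura locally, a Docker Compose file along these lines provides both. This is a minimal sketch — the image tags, ports, and password are placeholders; pin the versions to match your production setup:

```yaml
# docker-compose.yml — minimal local Postgres + Hasura (versions are examples)
version: "3.6"
services:
  postgres:
    image: postgres:12
    restart: always
    environment:
      POSTGRES_PASSWORD: postgrespassword
    ports:
      - "5432:5432"
  graphql-engine:
    image: hasura/graphql-engine:v1.3.3
    ports:
      - "8080:8080"
    depends_on:
      - postgres
    restart: always
    environment:
      HASURA_GRAPHQL_DATABASE_URL: postgres://postgres:postgrespassword@postgres:5432/postgres
      HASURA_GRAPHQL_ENABLE_CONSOLE: "true"
```

     `docker-compose up -d` then gives you an empty database and console at http://localhost:8080.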

  2. Dump the schema and data from your existing Hasura server (as per the answer by @protob, but with clean_output set so that manual changes to the output do not have to be made; see the pg_dump API docs for details).

    curl --location --request POST 'https://example.com/v1alpha1/pg_dump' \
      --header 'Content-Type: application/json' \
      --header 'X-Hasura-Role: admin' \
      --header 'Content-Type: text/plain' \
      --header 'x-hasura-admin-secret: {SECRET}' \
      --data-raw '{ "opts": ["-O", "-x","--inserts",  "--schema", "public"], "clean_output": true}' > hasura-db.sql
    
  3. Import the schema and data locally:

    psql -h localhost -U postgres < hasura-db.sql
    
  4. The local database has all the migrations because we copied the latest schema, so just mark them as applied:

    # A simple `hasura migrate apply --skip-execution` may work too!
    for x in $(hasura migrate status | grep "Not Present" | awk '{ print $1 }'); do
      hasura migrate apply --version $x --skip-execution
    done
    
    # and confirm the updated status
    hasura migrate status
    
  5. Now finally apply the Hasura metadata using the hasura CLI:

    hasura metadata apply
    

Enjoy your new instance!

Raman
  1. Backup the database.
  2. Run Hasura with the database.
  3. Make sure Hasura metadata is synced.
jjangga

Hasura has a special endpoint for executing pg_dump on the Postgres instance.

Here is a sample CURL request:

curl --location --request POST 'https://your-remote-hasura.com/v1alpha1/pg_dump' \
  --header 'Content-Type: application/json' \
  --header 'X-Hasura-Role: admin' \
  --data-raw '{ "opts": ["-O", "-x", "--inserts", "--schema", "public"] }'

It outputs the schema and data as a plain SQL script, suitable for psql.

For convenience, you can import, test, and run the cURL request in a tool such as Postman.

Follow the pg_dump documentation to adjust the opts as needed.

For example, the request above uses the "--inserts" opt, which produces "INSERT INTO" statements in the output.
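
For a hypothetical users table (table name and rows are illustrative only), the difference looks like this:

```sql
-- With --inserts: one statement per row, safe to paste into the console
INSERT INTO public.users (id, name) VALUES (1, 'Alice');
INSERT INTO public.users (id, name) VALUES (2, 'Bob');

-- Default pg_dump output uses COPY, which the console SQL tab rejects:
-- COPY public.users (id, name) FROM stdin;
-- 1	Alice
-- 2	Bob
-- \.
```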

The output can be copied, pasted, and run directly in the Hasura console's SQL tab ("COPY FROM stdin" statements result in errors when run from the console).

http://localhost:8080/console/data/sql

Before importing, comment out or delete the line CREATE SCHEMA public; from the dump, because the schema already exists.
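
If you saved the dump to a file (the filename hasura-db.sql here is illustrative), the line can be stripped with sed before importing:

```shell
# Delete the conflicting CREATE SCHEMA line in place, keeping a .bak backup
sed -i.bak '/^CREATE SCHEMA public;$/d' hasura-db.sql
```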

You also have to select the tables and relations to be tracked, either while executing the query or afterwards.

If the amount of data is larger, it may be better to use the CLI for the import.

protob