The first idea that comes to mind is to dump your tables by calling the pg_dump command, similar to the approach presented here (Google offers plenty of alternatives).
However, since your backup strategy requires you to select precise dates and not just whole tables, you will probably have to rely on a sequence of queries, in which case my advice is to use a library like Psycopg.
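If the per-table route is enough, pg_dump can be driven from Python as well. This is only a sketch: the database name, user, table and output file below are placeholders, and the command is skipped when pg_dump is not installed.

```python
import shutil
import subprocess

def build_pg_dump_command(dbname, user, table, outfile):
    """Build the argv list for a per-table pg_dump call.
    All parameter values are placeholders -- adapt them."""
    return ["pg_dump", "--dbname", dbname, "--username", user,
            "--table", table, "--file", outfile]

cmd = build_pg_dump_command("test", "postgres", "my_table", "my_table.sql")
# only run it if pg_dump is actually available on this machine
if shutil.which("pg_dump"):
    subprocess.run(cmd, check=True)
```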
EDIT:
I cannot provide a complete example since I don't know:
- which tables you want to dump
- what the precise backup strategy is for each table (i.e. the SELECT statement)
- how you want to restore them: by dropping the table and re-creating it, by overwriting rows based on an ID attribute, ...
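Whichever restore strategy you pick, replaying a file of SQL statements boils down to executing them one by one. A minimal sketch, assuming the target table already exists and the file holds one statement per line (the connection string is a placeholder):

```python
def iter_statements(sql_text):
    """Yield individual statements from dump text
    (assumes one statement per line)."""
    for line in sql_text.splitlines():
        line = line.strip()
        if line:
            yield line

def restore_from_dump(conn_string, dump_path):
    """Hypothetical restore: replay each statement into an existing table."""
    import psycopg2  # deferred so this module loads without the driver
    with psycopg2.connect(conn_string) as conn, conn.cursor() as cur:
        with open(dump_path) as f:
            for stmt in iter_statements(f.read()):
                cur.execute(stmt)
```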
The following example generates a file that stores the result of a single query.
import datetime

import psycopg2

conn = psycopg2.connect("dbname=test user=postgres")  # change this according to your RDBMS configuration
cursor = conn.cursor()

table_name = 'YOUR_TABLE_HERE'  # place your table name here

with open("table_dump.sql", "w") as f:  # open the dump file in write mode
    # NOTE: a table name cannot be passed as a query parameter, so make sure
    # table_name is trusted; change the query according to your needs
    cursor.execute("SELECT * FROM %s" % (table_name,))
    column_names = [c[0] for c in cursor.description]
    insert_prefix = 'INSERT INTO %s (%s) VALUES' % (table_name, ', '.join(column_names))
    for row in cursor.fetchall():
        row_data = []
        for rd in row:
            if rd is None:
                row_data.append('NULL')
            elif isinstance(rd, datetime.datetime):
                row_data.append("'%s'" % rd.strftime('%Y-%m-%d %H:%M:%S'))
            else:
                row_data.append(repr(rd))
        # this is the text that will be put in the SQL file; adapt it if you wish
        f.write('%s (%s);\n' % (insert_prefix, ', '.join(row_data)))

cursor.close()
conn.close()
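One caveat about the repr() fallback: it produces Python literals, not SQL literals. A value like O'Brien comes out wrapped in double quotes, which SQL parses as an identifier, not a string. A hypothetical helper that applies standard SQL quoting (single quotes, doubled when embedded) could replace that branch:

```python
import datetime

def sql_literal(value):
    """Render a Python value as an SQL literal (simplified sketch:
    handles None, numbers, datetimes and strings -- extend as needed)."""
    if value is None:
        return 'NULL'
    if isinstance(value, (int, float)):
        return str(value)
    if isinstance(value, datetime.datetime):
        return "'%s'" % value.strftime('%Y-%m-%d %H:%M:%S')
    # standard SQL string quoting: single quotes, doubled when embedded
    return "'%s'" % str(value).replace("'", "''")
```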