I am using Ubuntu 16.04 (Xenial). On this machine I installed BigSQL Postgres version 10, which is installed in the following directories:
BigsqlPostgres:
/etc/bigsql/pg10/bin/
/etc/bigsql/data/
I am using the script below to take a backup of the database:
import os, sys

parent_dir = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
sys.path.append(parent_dir)

import boto3, pytz
from datetime import datetime
from sellerhub_midport.utils.variables import (
    _POSTGRES_HOST, _POSTGRES_PORT, _POSTGRES_USERNAME, _POSTGRES_PASSWORD,
    _POSTGRES_DBNAME, _AWS_ACCESS_KEY_ID, _AWS_SECRET_KEY, _BUCKET_NAME)


class BackupDatabase(object):
    """Dump the database with pg_dump and upload the dump to S3."""

    def __init__(self):
        super(BackupDatabase, self).__init__()
        localtz = pytz.timezone('UTC')
        self.filename = 'report' + datetime.now(localtz).strftime('%Y_%m_%d') + '.dump'
        self.pathname = 'prod_backup/' + self.filename
        self.backup_database()

    def backup_database(self):
        # pg_dump reads the password from the environment.
        os.environ['PGPASSWORD'] = _POSTGRES_PASSWORD
        # Custom-format dump, written to self.filename in the current working directory.
        os.system('/etc/bigsql/pg10/bin/pg_dump --format=c -h {} -p {} -d {} -U {} > {}'.format(
            _POSTGRES_HOST, _POSTGRES_PORT, _POSTGRES_DBNAME, _POSTGRES_USERNAME, self.filename))
        # Upload the dump to S3, then delete the local copy.
        boto3_res = boto3.resource(
            service_name='s3',
            aws_access_key_id=_AWS_ACCESS_KEY_ID,
            aws_secret_access_key=_AWS_SECRET_KEY)
        with open(self.filename, 'rb') as f_obj:
            boto3_res.Bucket(_BUCKET_NAME).put_object(Key=self.pathname, Body=f_obj)
        os.remove(self.filename)


BackupDatabase()
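One thing I notice while writing this up: os.system() discards pg_dump's exit status, so a failed dump would never show up in my log. A subprocess-based variant like the sketch below (the method name is mine, and this is untested in my setup) should at least print the error:

    import subprocess

    def backup_database_verbose(self):
        # Same dump as above, but a failing pg_dump prints its stderr,
        # which would end up in /tmp/listener.log when run from cron.
        env = dict(os.environ, PGPASSWORD=_POSTGRES_PASSWORD)
        cmd = ['/etc/bigsql/pg10/bin/pg_dump', '--format=c',
               '-h', _POSTGRES_HOST, '-p', str(_POSTGRES_PORT),
               '-d', _POSTGRES_DBNAME, '-U', _POSTGRES_USERNAME]
        with open(self.filename, 'wb') as dump_file:
            proc = subprocess.Popen(cmd, stdout=dump_file,
                                    stderr=subprocess.PIPE, env=env)
            _, err = proc.communicate()
        if proc.returncode != 0:
            print('pg_dump failed with code {}: {}'.format(proc.returncode, err))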
If I run the above backup.py file from my Django virtual environment, it creates a backup file named with the current date. But when I schedule the same file with a cron job, no backup is created, even though running the file manually works fine every time. Below is the cron entry I wrote:
10 14 * * * /usr/bin/python /home/ekodev/sellerhub_midport/sellerhub_midport/scripts/service_crons/backup_db.py > /tmp/listener.log 2>&1
I am not getting any error in the log. When I look into /tmp/listener.log, it only shows that everything is OK.
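Since the log is not telling me anything, my next step is to add a few temporary diagnostic prints at the top of backup_db.py and compare a cron run against a manual run (a sketch; these lines are not in the script yet):

    import os, sys

    # Temporary diagnostics: cron runs with a minimal environment and a
    # different working directory, so compare these with a manual run.
    print('interpreter: ' + sys.executable)
    print('cwd: ' + os.getcwd())
    print('PATH: ' + str(os.environ.get('PATH')))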
Thanks for your precious time.