
I have a Django website running, and I am looking for a way to run a script that constantly fetches data from the Internet and inserts it into my Django app's database.

How can I keep the script running as long as the server is up? (No need to start it manually, and I don't want it to stop at all!) I guess Celery is not what I want!?

And how can I use the Django ORM in this script to insert data into the DB? I tried threading, but I think it would have to be started by calling the script, or ..

Stick

1 Answer


You can indeed use Celery for this job, and you can take full advantage of the ORM without any need to query your DB manually. Define a Celery task that does what you want (fetching + saving):

    @app.task
    def fetching_script():
        # assumes the endpoint returns a JSON list of dicts
        response = requests.get(url)
        for item in response.json():  # .content is raw bytes; .json() parses the body
            instance = ModelInstance(**item)
            instance.full_clean()
            instance.save()

Then, in your Celery config:

    from celery.schedules import crontab

    app.conf.update(
        CELERYBEAT_SCHEDULE={
            'fetch_and_save': {
                'task': 'path.to.your.fetching_script',
                'schedule': crontab(minute='*/15'),
            }
        }
    )
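With the schedule in place, you can start a worker together with an embedded beat scheduler. This is a minimal sketch assuming your Celery app lives in a module named `proj` (substitute your own):

```shell
# start a worker plus the embedded beat scheduler (fine for a single worker)
celery -A proj worker -B --loglevel=info
```

In production, beat is usually run as a separate process (`celery -A proj beat`) so that only one scheduler is ever active.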

This will execute your task every 15 minutes; you will need to tweak the schedule based on your needs and on how long the script takes to run. Another way to achieve this is to write a standalone script that does the job (using the Django DB wrapper, for example) and add it to the crontab, assuming you're running a Linux distro on the server side.
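The standalone-script route might look roughly like this. It is only a sketch: `mysite.settings`, the URL, and `myapp.models.ModelInstance` are placeholder names for your own project. The key part is calling `django.setup()` before importing any models, so the ORM works outside of `manage.py`:

```python
# fetch_script.py -- standalone sketch, run by cron outside manage.py
import os

import django

# point at your settings module before importing any models (placeholder name)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()

import requests

from myapp.models import ModelInstance  # placeholder app/model


def main():
    response = requests.get("https://example.com/api/items")  # placeholder URL
    response.raise_for_status()
    for item in response.json():
        instance = ModelInstance(**item)
        instance.full_clean()
        instance.save()


if __name__ == "__main__":
    main()
```

A matching crontab entry running it every 15 minutes could be:

    */15 * * * * /usr/bin/python3 /path/to/fetch_script.py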

Mattia Procopio