So I am currently implementing my very first Django-based web app. However, I have figured out that I only need Django for a backend cron job that scrapes data from a website and then updates existing data in a PostgreSQL database. A React frontend then retrieves the data from the database and visualizes it on the webpage.
My issue now is that I don't know how to tackle this conceptually. Currently, I have a model in my `models.py` file that successfully created my (empty) table in PostgreSQL:
```python
from django.db import models

# Create your models here.
class rainAreas(models.Model):
    Country = models.CharField(max_length=100)
    HasRain = models.BooleanField()
    Since = models.DateField()

    class Meta:
        app_label = "rain_areas"
```
I also filled the table manually with dummy data. Finally, I have a script in my `admin.py` file that successfully builds the desired list of data scraped from the website. It looks like this:
```python
my_data = [{"country": "Germany", "HasRain": True, "Since": "2020-08-11"}, {"country": "France",....
```
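From what I have read so far, I imagine the update step could use the ORM's `update_or_create`, matching rows on country, but I am not sure this is the right approach. Here is a rough sketch of what I mean; `sync_rain_data` is just a placeholder name of mine, and I pass the model in as an argument so the snippet stands on its own (in practice it would be my `rainAreas` model):

```python
# Rough sketch, not tested: upsert each scraped row into the table via
# the ORM. In the real project I would call sync_rain_data(rainAreas, my_data).
def sync_rain_data(model, my_data):
    for row in my_data:
        # update_or_create looks up a row by Country and either updates
        # the remaining fields or creates a new row if none exists
        model.objects.update_or_create(
            Country=row["country"],  # note: my scraped key is lowercase "country"
            defaults={"HasRain": row["HasRain"], "Since": row["Since"]},
        )
```

Is something along these lines how it is usually done, or is there a more idiomatic way to bulk-update from a scraper?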
But now I am stuck. What are the next steps to

- perform an SQL `UPDATE` on the table I created, using the data I have in `admin.py`?
- turn this script into a cron job that runs every hour?
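For the cron part, my current idea (I don't know if this is idiomatic) is to wrap the scraping-and-update logic in a custom management command, say `rain_areas/management/commands/sync_rain.py` (a file and command name I made up), so that cron can invoke it through `manage.py`. The crontab entry would then look roughly like this, with the project and virtualenv paths as placeholders:

```shell
# Hypothetical crontab entry (edit with "crontab -e"); runs at minute 0
# of every hour. /path/to/project and /path/to/venv are placeholders.
0 * * * * cd /path/to/project && /path/to/venv/bin/python manage.py sync_rain >> /tmp/sync_rain.log 2>&1
```

Does that sound like a reasonable setup, or should the scheduling live somewhere else entirely (e.g. Celery beat)?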