I'm making a Python program that visualizes COVID-19 vaccination data on a world map, using Our World in Data's vaccination data in .json format. I'd like to add a feature where the program downloads the latest .json file from OWID's GitHub repository and replaces the old one, provided there is at least a 24-hour difference between the 'last modified' dates of the two files.
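For context, this is roughly how I plan to implement the 24-hour check (a minimal sketch, not my final code; I'm assuming the server actually returns a `Last-Modified` header, which may not hold for every host):

```python
import datetime as dt
import email.utils
import urllib.request


def needs_refresh(local_mtime: dt.datetime, remote_mtime: dt.datetime,
                  threshold: dt.timedelta = dt.timedelta(hours=24)) -> bool:
    """Return True if the remote file is at least `threshold` newer than the local copy."""
    return remote_mtime - local_mtime >= threshold


def remote_last_modified(url: str) -> dt.datetime:
    """Fetch only the Last-Modified header via a HEAD request (no file body downloaded).

    Assumes the server sends a Last-Modified header; raises KeyError otherwise.
    """
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return email.utils.parsedate_to_datetime(resp.headers["Last-Modified"])
```

So before downloading, I'd call `remote_last_modified()` on the raw-file URL, compare it to the local file's modification time with `needs_refresh()`, and only re-download when the gap is 24 hours or more.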
My question: can I instead harness Git/GitHub's delta mechanism to quickly compare the contents of the local and remote files and download only the parts that differ, to cut down on the amount of data transferred? The end goal is to use as little bandwidth and time as possible when fetching a fresh version of the file.