The question is about automating the download of data from an authenticated Django website, directly from a headless Linux server. Being able to do it with a Python script would be great.
Background
The Challenge Data website hosts data science challenges and is written in Django.
The 18-Owkin data challenge provides a fairly large dataset (over 10 GB). You need to be authenticated in order to download the data.
I'm able to download the files from my Windows 10 laptop: after authenticating on the website and clicking a download link such as y_train, the download starts automatically.
However, I want to get the data onto a headless (no GUI) Linux machine on a GPU cloud. I can upload it from my laptop, but that is very slow because my upstream bandwidth is low.
Do you see a way to fetch the data directly from the Linux server? That would mean:
- Authenticate to the (Django) website.
- Then request the "download URL". Will that actually trigger the download?
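To make the question concrete, here is a minimal sketch of what I have in mind, using the `requests` library. Since Django's login view is normally protected by CSRF, the idea would be to fetch the login page first to obtain the csrftoken cookie, post the credentials along with it, and then stream the file to disk. The login URL, form field names, and file name below are assumptions that would need to be checked against the site's actual login form (e.g. with the browser's developer tools):

```python
import requests

LOGIN_URL = "https://challengedata.ens.fr/login/"  # assumed login page URL
FILE_URL = "https://challengedata.ens.fr/..."      # the y_train download link

with requests.Session() as session:
    # Fetch the login page first so Django sets the csrftoken cookie.
    session.get(LOGIN_URL)
    csrf_token = session.cookies.get("csrftoken")

    # Submit the login form. Django's CSRF check expects the token as a
    # form field, and some setups also verify the Referer header.
    response = session.post(
        LOGIN_URL,
        data={
            "username": "your_username",       # assumed field name
            "password": "your_password",       # assumed field name
            "csrfmiddlewaretoken": csrf_token,
        },
        headers={"Referer": LOGIN_URL},
    )
    response.raise_for_status()

    # Stream the large file to disk in chunks rather than loading
    # the whole 10+ GB into memory.
    with session.get(FILE_URL, stream=True) as download:
        download.raise_for_status()
        with open("y_train.csv", "wb") as f:   # assumed file name
            for chunk in download.iter_content(chunk_size=1 << 20):
                f.write(chunk)
```

Is this the right approach, or does the authenticated session need something more (e.g. a redirect to follow, or extra headers) before the download URL will serve the file?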