I am using boto3 to upload massive media files (2GB+) from my Django website to an S3 bucket. My issue is that when uploading any file larger than 2.5MB, the connection times out immediately and no debug information is displayed.
- I assume what is happening is that Django is switching to the temporary file upload handler (its default once a file exceeds 2.5MB), but that handler won't work on the server the Django app is running on (PythonAnywhere free tier); see the settings sketch after this list.
- This works flawlessly locally, because the file only has to be copied to my own machine, not to the hosting server. I would like to skip the web server entirely so I am not burning through all of its storage/bandwidth.
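For reference, this is my understanding of the relevant Django defaults (just a sketch, these are not settings I have changed):

# settings.py -- sketch of Django's default upload behaviour, not my actual config
# Files up to 2.5MB stay in memory; anything larger is streamed to a temp file
# on disk by TemporaryFileUploadHandler, which is where I think the
# PythonAnywhere free tier falls over.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440  # 2.5MB, Django's default
FILE_UPLOAD_HANDLERS = [
    'django.core.files.uploadhandler.MemoryFileUploadHandler',
    'django.core.files.uploadhandler.TemporaryFileUploadHandler',
]
FILE_UPLOAD_TEMP_DIR = None  # default: the system temp dir, e.g. /tmp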
s3.upload_file needs the local path of the file in order to upload it to S3, and the only way I could find to get that is to grab the temporary file path of the uploaded file.
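(I know boto3 also has upload_fileobj, which takes a file-like object instead of a path. A rough sketch of what I think that would look like is below, reusing the credentials dict and settings names from my code; I haven't confirmed it actually avoids the temp copy, since Django has already written the file to disk by the time my view sees it.)

# sketch only: upload the Django file object directly, no local path needed
import boto3
from django.conf import settings

def s3_upload_fileobj(f, key):
    # credentials is the same dict of AWS keys used in my function below
    s3 = boto3.client('s3', settings.AWS_S3_REGION_NAME, **credentials)
    # f is the UploadedFile from request.FILES; boto3 reads it as a stream
    s3.upload_fileobj(f, settings.AWS_STORAGE_BUCKET_NAME, key,
                      ExtraArgs={'ACL': 'public-read'})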
import boto3
from boto3.s3.transfer import TransferConfig
from django.conf import settings

# method to chunk the upload into multipart pieces (threshold set in TransferConfig below)
def s3_multipart_upload(f, video_id):
    # destination key inside the bucket: media/<video_id>/<file name>
    file_path = "media/"
    new_folder = str(video_id) + "/"
    file_name = f.name
    server_path = file_path + new_folder + file_name
    # Django has already copied the upload to a temp file, so grab that path
    local_path = f.temporary_file_path()
    # create the S3 client connection (credentials is my dict of AWS keys)
    s3 = boto3.client('s3', settings.AWS_S3_REGION_NAME, **credentials)
    config = TransferConfig(
        multipart_threshold=1024 * 25,
        max_concurrency=10,
        multipart_chunksize=1024 * 25,
        use_threads=True
    )
    s3.upload_file(
        local_path,
        settings.AWS_STORAGE_BUCKET_NAME,
        server_path,
        ExtraArgs={'ACL': 'public-read'},
        Config=config,
        Callback=ProgressPercentage(local_path)
    )
    return
I would like to be able to upload directly to S3, straight from my website, without the files ever passing through the server (PythonAnywhere). If there is a cleaner way of doing this, a better host I should use, or a way to upload the file without copying it to a temp file first, I'm all ears.
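From what I've read, uploading straight from the browser would involve the server handing out some kind of presigned URL and the browser POSTing the file to S3 itself. A very rough sketch of how I imagine the server side of that would look (I have not tried this, and the expiry/conditions values are just guesses):

# sketch only: generate a presigned POST so the browser can upload straight to S3
import boto3
from django.conf import settings

def get_presigned_post(key):
    # credentials is the same dict of AWS keys used elsewhere in my code
    s3 = boto3.client('s3', settings.AWS_S3_REGION_NAME, **credentials)
    return s3.generate_presigned_post(
        Bucket=settings.AWS_STORAGE_BUCKET_NAME,
        Key=key,
        Fields={'acl': 'public-read'},
        Conditions=[{'acl': 'public-read'}],
        ExpiresIn=3600  # guessing at one hour to cover a 2GB upload
    )
# the returned dict has 'url' and 'fields' that the front-end form would use to POST the file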
(I am also new to Python/Django/servers in general, so any help is appreciated.)