
I need to upload large (2-20 GB) videos to a streaming service without loading them into RAM. The code I wrote works, but I don't have the resources to handle large files. Is there a way to do this in Python?

import requests

url = "https://www.example.com/uls/jAZUhVzeU78"  # upload endpoint from the API example below

def upload(file):
    # requests builds the entire multipart body in memory before sending it
    with open(file, 'rb') as f:
        files = {'file1': f}
        r = requests.post(url, files=files)
    data = r.json()

    print(data["msg"])
    return data["result"]["id"]

Their API says I have to POST the file and it must be multipart/form-data encoded. Example with curl: curl -F file1=@/path/to/file.txt https://www.example.com/uls/jAZUhVzeU78
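
A minimal sketch of the streaming approach, assuming the requests-toolbelt package (pip install requests-toolbelt), which the requests documentation points to for streaming multipart uploads; the file1 field name and URL come from the curl example above, while the filename and video/mp4 content type are placeholders:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

def upload_streaming(path, url):
    # MultipartEncoder reads the file lazily, so the multipart body is
    # streamed from disk instead of being assembled in memory first.
    with open(path, 'rb') as f:
        encoder = MultipartEncoder(
            fields={'file1': ('file.mp4', f, 'video/mp4')}  # placeholder name/MIME type
        )
        # The encoder goes in data=, and its content_type (which carries the
        # multipart boundary) has to be sent as the Content-Type header.
        r = requests.post(url, data=encoder,
                          headers={'Content-Type': encoder.content_type})
    return r.json()["result"]["id"]

upload_streaming('/path/to/file.mp4', 'https://www.example.com/uls/jAZUhVzeU78')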

  • Sorry, my initial comment was wrong, you are using the `files` parameter. The [documentation covers this](https://2.python-requests.org/en/master/user/quickstart/#post-a-multipart-encoded-file): *In the event you are posting a very large file as a `multipart/form-data` request, you may want to stream the request. By default, requests does not support this, but there is a separate package which does - `requests-toolbelt`. You should read the [toolbelt’s documentation](https://toolbelt.readthedocs.io/en/latest/) for more details about how to use it.* – Martijn Pieters Jul 31 '19 at 11:23
  • The new duplicate target addresses this too. – Martijn Pieters Jul 31 '19 at 11:24
  • Note that when sending things that large, you should look for an API that allows you to resume transfers. The chance of network failure goes up as you transfer more data, as does the cost of retrying. Once you can resume transfers, you might be able to upload in, e.g., 10 MB chunks, which would obviate the original problem. – Sam Mason Jul 31 '19 at 11:37
  • I tried the methods described in the toolbelt's documentation, but that uses data= not files=, so the files won't upload, even though the request starts and the final response is 200 "OK" (see the sketch after this list). I'm using verystream.com's API, by the way. – Kovács Szabolcs Jul 31 '19 at 12:20
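
Regarding the data= vs files= point: with MultipartEncoder, data= is the intended parameter, and the request is still multipart/form-data as long as the encoder's content_type (which carries the boundary) is sent as the Content-Type header. A minimal sketch, assuming the same placeholder endpoint and field name, that wraps the encoder in the toolbelt's MultipartEncoderMonitor so a multi-gigabyte upload at least reports progress:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoderMonitor

def upload_with_progress(path, url):
    def progress(monitor):
        # Called repeatedly as the body is read; bytes_read grows as data is sent.
        print(f"\r{monitor.bytes_read} bytes sent", end="")

    with open(path, 'rb') as f:
        monitor = MultipartEncoderMonitor.from_fields(
            fields={'file1': ('file.mp4', f, 'video/mp4')},  # placeholder name/MIME type
            callback=progress,
        )
        r = requests.post(url, data=monitor,
                          headers={'Content-Type': monitor.content_type})
    return r.json()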

0 Answers