Banging my head on a problem. I will caveat in advance that this is not reproducible, since I cannot share my endpoint. Also, I work as a data scientist, so my knowledge of web technologies is limited.
from urllib.request import Request, urlopen

url = "https://www.some_endpoint.com/"
req = Request(url, headers={"API-TOKEN": "some_token"})
json_string = '{"object": "XYZ".....}'  # single quotes so the embedded double quotes are valid
response = urlopen(req, json_string.encode("utf-8"))
I am getting unusual behavior from the urlopen call. When my JSON payload is below 65536 bytes, as measured by len(json_string.encode('utf-8')), the call works fine. When it is over that limit, I get an HTTP 500 error.
Is this purely a server-side limitation on request size? What is unusual is that when the same large data is passed to the endpoint through a GUI, it works fine. Alternatively, is there something I can do to chunk my data into sub-64 KiB requests before calling urlopen? Are there industry standards for handling this?
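In case it helps clarify what I mean by chunking: below is a rough sketch of the kind of approach I am wondering about, where a list of records is split into batches whose serialized size stays under the 64 KiB threshold, so each batch could be sent in a separate request. This assumes the endpoint would accept partial batches, which I do not actually know (the `chunk_records` helper and the record shape are made up for illustration):

```python
import json

def chunk_records(records, max_bytes=65536):
    """Split JSON-serializable records into batches whose serialized
    UTF-8 size stays under max_bytes (a single oversized record still
    ends up alone in its own batch)."""
    batches, current = [], []
    for rec in records:
        candidate = current + [rec]
        if current and len(json.dumps(candidate).encode("utf-8")) > max_bytes:
            # Adding this record would push the batch over the limit,
            # so commit the current batch and start a new one.
            batches.append(current)
            current = [rec]
        else:
            current = candidate
    if current:
        batches.append(current)
    return batches

# Hypothetical example: many small records split into sub-64 KiB batches.
records = [{"object": "XYZ", "id": i} for i in range(10000)]
batches = chunk_records(records)
# Each batch could then be POSTed separately, e.g.:
#   urlopen(req, json.dumps(batch).encode("utf-8"))
```

Whether this is even a sane thing to do presumably depends on the API; I would be interested to hear if there is a standard pattern for it.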