I'm building a personal web scraper and I'm still in the dev phase, but I want to start saving data. The problem is that I cannot PUT or POST from the notebook, which also means I cannot iterate through a big list of dictionaries/JSON objects.
I can, however, do it manually via Postman by just pasting the body and sending it.
Below is my code, which currently returns:
The pastebin URL is:{"message": "Could not parse request body into json: Unrecognized token 'ID': was expecting 'null', 'true', 'false' or NaN\n at [Source: (byte[])"ID=0&District=London"; line: 1, column: 4]"}
import requests
# defining the api-endpoint
API_ENDPOINT = ##############
# data to be sent to api
body = {
"ID": 0,
"District": "London"
}
# sending post request and saving response as response object
r = requests.post(url = API_ENDPOINT, data = body)
# extracting response text
pastebin_url = r.text
print("The pastebin URL is:%s"%pastebin_url)
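From the error text (`ID=0&District=London`), it looks like `data=body` form-encodes the dict, while the API is trying to parse the body as JSON. A minimal sketch of the difference, using prepared requests so nothing is actually sent (the endpoint URL here is just a placeholder, not my real one):

```python
import requests

API_ENDPOINT = "https://example.com/items"  # placeholder endpoint

body = {"ID": 0, "District": "London"}

# data= form-encodes the dict into key=value pairs
form_req = requests.Request("POST", API_ENDPOINT, data=body).prepare()

# json= serializes the dict to JSON and sets the Content-Type header
json_req = requests.Request("POST", API_ENDPOINT, json=body).prepare()

print(form_req.body)                     # ID=0&District=London
print(json_req.body)                     # b'{"ID": 0, "District": "London"}'
print(json_req.headers["Content-Type"])  # application/json
```

So presumably switching `data=body` to `json=body` in my `requests.post` call (or passing `data=json.dumps(body)` with a `Content-Type: application/json` header) would give the API what it expects.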
Related question: can I use urllib instead of requests here?
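For context, here's what I think the equivalent stdlib version would look like; with `urllib` you apparently have to serialize the body and set the `Content-Type` header yourself (again, the endpoint URL is a placeholder):

```python
import json
import urllib.request

API_ENDPOINT = "https://example.com/items"  # placeholder endpoint

body = {"ID": 0, "District": "London"}

# Build the request: JSON-encode the body and declare the content type
req = urllib.request.Request(
    API_ENDPOINT,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending would be: urllib.request.urlopen(req).read()
```

Is that right, or is there a reason to prefer requests for this?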