I'm struggling to get OAuth2 authorisation working in a script when I run it on an EC2 instance running Ubuntu 13.04. The relevant snippet is:
    with open('creds.txt') as f:
        creds = {}
        for line in f:
            creds[line.split(',')[0]] = line.split(',')[1].rstrip('\n')
    self.client_id = creds['client_id']
    self.client_secret = creds['client_secret']
    self.username = creds['username']
    self.password = creds['password']
    token_response = requests.post(
        "https://example.com/oauth2/access_token/",
        data={
            "grant_type": "password",
            "client_id": self.client_id,
            "client_secret": self.client_secret,
            "username": self.username,
            "password": self.password,
            "scope": "read+write"}).json()
It runs fine on my home computer (running Windows 7), but not when I run it remotely, where I get: {u'error': u'invalid_client'}
I've tried setting up a new client ID and secret and still get the same response.
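One thing I haven't ruled out yet is invisible characters sneaking into the parsed values; for example, if creds.txt was created on Windows, each line may end in \r\n, and rstrip('\n') would leave the \r behind. A quick sanity check along these lines would make any stray characters visible (a sketch; io.StringIO stands in for the real file here so the effect is reproducible):

    import io

    # Simulated creds.txt with Windows (CRLF) line endings -- an assumption,
    # standing in for the real file, to show how a stray '\r' survives parsing.
    sample = io.StringIO("client_id,abc123\r\nclient_secret,s3cret\r\n")

    creds = {}
    for line in sample:
        # same parsing as in the snippet above
        creds[line.split(',')[0]] = line.split(',')[1].rstrip('\n')

    for key, value in creds.items():
        print(key, repr(value))  # repr() exposes any hidden '\r' or spaces

Running the same repr() check against the real creds dict on both machines would show whether the values actually differ.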
- Why does it work differently on the remote server than on my own machine?
- Does it matter on which machine the application was created (see comment)? I eliminated this possibility by successfully authenticating using cURL in both environments.
The only thing I can think of now is that perhaps the requests library handles the POST request differently on Ubuntu. Does anyone know if this is the case?
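One way to test whether requests really sends something different would be to enable wire-level logging on both machines and diff the output. Something like this should work (a sketch, using the Python 3 module name; on Python 2 the module is httplib, and the vendored logger may be named requests.packages.urllib3 instead):

    import logging
    import http.client  # `httplib` on Python 2

    # Log the raw request line, headers and body that requests/urllib3 send,
    # so the output can be compared between the Windows and Ubuntu machines.
    http.client.HTTPConnection.debuglevel = 1
    logging.basicConfig(level=logging.DEBUG)
    logging.getLogger("urllib3").setLevel(logging.DEBUG)

With this in place before the requests.post call, both machines print the exact bytes going over the wire, so any difference in the POST body would show up directly.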