
I'm developing a client for a website.

When I use Chrome or Firefox to access the site, the browser stores some cookies on my local side beyond those delivered in the Set-Cookie header of the HTTP response.

I need to extract that additional information from my local files so I can send a request the remote server will accept.

Can anyone tell me how to do this in Python?

Best,

CodingCat

1 Answer


You have many options. The best one seems to be to use urllib2. Take a look at How to use Python to login to a webpage and retrieve cookies for later usage? for some excellent answers.

Here's the code from the top answer there. It logs in, collects the cookies the server sets, and then accesses a restricted page:

import urllib, urllib2, cookielib

username = 'myuser'
password = 'mypassword'

# The CookieJar collects every cookie the server sets; the opener
# then attaches them automatically to each subsequent request.
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
login_data = urllib.urlencode({'username' : username, 'password' : password})
opener.open('http://www.example.com/login.php', login_data)
resp = opener.open('http://www.example.com/hiddenpage.php')
print resp.read()
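If you're on Python 3, urllib2 and cookielib were merged into urllib.request and http.cookiejar. A rough equivalent of the flow above, wrapped in a function so nothing fires on import (the URLs and form fields are the same placeholders as in the answer, not a real site):

```python
import urllib.parse
import urllib.request
import http.cookiejar

def login_and_fetch(base='http://www.example.com'):
    # Jar + cookie-aware opener: cookies set during login are
    # replayed automatically on the follow-up request.
    cj = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
    data = urllib.parse.urlencode({'username': 'myuser',
                                   'password': 'mypassword'}).encode()
    opener.open(base + '/login.php', data)        # login sets cookies in cj
    resp = opener.open(base + '/hiddenpage.php')  # cookies sent back here
    return resp.read()
```

Note that POST data must be bytes in Python 3, hence the `.encode()` after `urlencode`.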
amrav
  • It seems this method doesn't work for my case. After I access the "public page" to gain the cookie information, I print the contents of cj, and it only contains the cookies returned with the HTTP response. Checking with the Firebug plugin in Firefox, some values are obviously missing, which is why my requests are blocked. – CodingCat Dec 30 '12 at 00:26
  • Could you share the contents of cj and which values are missing? It looks like the "public page" might be redirecting you to an intermediate page first to set cookies, as is often done. In that case, you have to visit that intermediate page too, programmatically. – amrav Dec 30 '12 at 09:10
  • The cj shows only the cookies from the response; the missing values are s_vnum and s_vi – CodingCat Jan 01 '13 at 01:24
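Cookies like s_vnum and s_vi are often set by client-side JavaScript or by a third-party tracking domain rather than by a Set-Cookie header on the page itself, so a cookie-handling opener alone may never see them. One workaround is to copy their values out of the browser and inject them into the jar by hand. A minimal Python 3 sketch, where the cookie values and the `.example.com` domain are made-up placeholders standing in for whatever the browser actually stored:

```python
import http.cookiejar
import urllib.request

def make_cookie(name, value, domain):
    # Build a Cookie object by hand; in practice name/value/domain
    # would be copied from the browser's cookie store.
    return http.cookiejar.Cookie(
        version=0, name=name, value=value,
        port=None, port_specified=False,
        domain=domain, domain_specified=True,
        domain_initial_dot=domain.startswith('.'),
        path='/', path_specified=True,
        secure=False, expires=None, discard=True,
        comment=None, comment_url=None, rest={},
    )

cj = http.cookiejar.CookieJar()
for name, value in [('s_vnum', '12345'), ('s_vi', 'abcdef')]:
    cj.set_cookie(make_cookie(name, value, '.example.com'))

# Requests made through this opener will now carry the injected cookies
# for matching domains, alongside any cookies the server sets later.
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
print(sorted(c.name for c in cj))  # ['s_vi', 's_vnum']
```

The same approach works on Python 2 with `cookielib.Cookie`, which takes the same arguments positionally.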