I'm looking for a way to make lots of asynchronous web requests without waiting for an answer.
Here is my current code:
import mechanize
from mechanize._opener import urlopen
from mechanize._form import ParseResponse
from multiprocessing import Pool

brow = mechanize.Browser()
brow.open('https://website.com')

# Login
brow.select_form(nr=0)
brow.form['username'] = 'user'
brow.form['password'] = 'password'
brow.submit()

while True:
    # async open the browser until some state is fulfilled
    brow.open('https://website.com/needthiswebsite')
The problem with the code above is that if I try to make two browser openings, bro2 has to wait for bro1 to finish before it can start (it's blocking):
bro1.open('https://website.com/needthiswebsite')
bro2.open('https://website.com/needthiswebsite')
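What I want is for both opens to be in flight at the same time. A minimal sketch of the behaviour I'm after, using plain threads just to illustrate the idea (bro1 and bro2 are two separate Browser instances):

# Sketch: fire both requests at once instead of one after the other.
import threading

t1 = threading.Thread(target=bro1.open,
                      args=('https://website.com/needthiswebsite',))
t2 = threading.Thread(target=bro2.open,
                      args=('https://website.com/needthiswebsite',))
t1.start()
t2.start()  # starts immediately, without waiting for t1
t1.join()   # wait for both to finish
t2.join()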
My attempt at a solution:
# PSEUDO-CODE

def openWebsite(browser):
    result = browser.open('https://website.com/needthiswebsite')
    if result.something() == WHATIWANT:
        return True
    return False

def updateState(result):
    # callback: flip the global flag once one call returns a positive answer
    global state
    if result:
        state = True

# GLOBAL VARIABLE STATE
state = False
pool = Pool(processes=1)
while not state:
    # async open the browser until some state is fulfilled;
    # I spam this function until I get a positive answer from one of the calls
    pool.apply_async(openWebsite, [brow1], callback=updateState)
I was trying to implement a solution similar to the answer to the Asynchronous method call in Python? question on Stack Overflow.
The problem with this is that I get an error when I try to use pool.apply_async(brow.open()):
ERROR MSG:
PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed
I have tried lots of things to fix the PicklingError, but nothing seems to work.
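From what I understand, the Browser object itself is what can't be pickled. One variation I tried was to pass only the URL (which is picklable) to the worker and create the Browser inside it. That avoids the PicklingError, but the fresh Browser in the worker is no longer logged in, so it doesn't solve my real problem. Sketch (open_website and the status-code check are just illustrative stand-ins):

# Variation: only the URL crosses the process boundary; the Browser
# is created inside the worker instead of being passed to apply_async.
# Drawback: the new Browser has no login session.
import time
import mechanize
from multiprocessing import Pool

def open_website(url):
    browser = mechanize.Browser()   # fresh, un-logged-in browser
    response = browser.open(url)
    return response.code == 200     # stand-in for my real WHATIWANT check

def update_state(result):
    global state
    if result:
        state = True                # one call got a positive answer

if __name__ == '__main__':
    state = False
    pool = Pool(processes=4)
    while not state:
        pool.apply_async(open_website,
                         ['https://website.com/needthiswebsite'],
                         callback=update_state)
        time.sleep(0.5)             # throttle the spamming a little
    pool.close()
    pool.join()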
- Is it possible to do this with mechanize?
- Should I switch to another library like urllib2 (e.g. something like the sketch below)?
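If switching libraries is the way to go, I was imagining something like this with urllib2 plus a thread pool, since threads share memory and nothing needs to be pickled (sketch; login/cookie handling is omitted and the status-code check stands in for my real condition):

# Sketch: urllib2 with a thread pool. multiprocessing.dummy has the
# same API as multiprocessing but uses threads, so there is no pickling.
import urllib2
from multiprocessing.dummy import Pool

def open_website(url):
    response = urllib2.urlopen(url)
    return response.getcode() == 200  # stand-in for my real condition

pool = Pool(4)
# fire off several requests at once; none of them blocks the others
async_results = [pool.apply_async(open_website,
                                  ['https://website.com/needthiswebsite'])
                 for _ in range(10)]
answers = [result.get() for result in async_results]
pool.close()
pool.join()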
Any help would be really appreciated. :)