
I'm trying to crawl a site that is only fully rendered after its JavaScript runs, so I use spynner.Browser in a Scrapy downloader middleware, as shown below. The problem is that the site requires cookies to be enabled.

How can I pass the cookies to spynner.Browser in scrapy?

scrapy.request -> spynner.Browser() -> scrapy.response

import spynner
import pyquery
from scrapy.http import HtmlResponse

class WebkitDownloaderTest(object):
    def process_request(self, request, spider):
        browser = spynner.Browser()
        if 'Cookie' in request.headers:
            browser.set_cookies(request.headers['Cookie'])  # is this correct?
        browser.create_webview()
        browser.set_html_parser(pyquery.PyQuery)
        browser.load(request.url, 20)
        try:
            browser.wait_load(10)
        except Exception:
            # wait_load raises a timeout error if the page is slow; ignore it
            pass
        renderedBody = browser.html.encode('utf-8')
        cookies = browser.cookies  # read the cookies before closing the browser
        browser.close()
        return HtmlResponse(request.url,
            cookies=cookies,  # is this correct?
            body=renderedBody)
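For reference, this is how I understand the Cookie header Scrapy sends: a single raw string (bytes in newer Scrapy versions) of `name=value` pairs separated by semicolons. A minimal sketch of turning that header into a dict before handing it to the browser (`parse_cookie_header` is a hypothetical helper I wrote for illustration, not part of spynner or Scrapy):

```python
def parse_cookie_header(header_value):
    """Parse a raw Cookie header (str or bytes) into a name -> value dict."""
    if isinstance(header_value, bytes):
        header_value = header_value.decode('latin-1')
    cookies = {}
    for pair in header_value.split(';'):
        # each pair looks like "name=value"; skip malformed fragments
        if '=' in pair:
            name, _, value = pair.strip().partition('=')
            cookies[name] = value
    return cookies

# e.g. parse_cookie_header(b'sessionid=abc123; lang=en')
# gives {'sessionid': 'abc123', 'lang': 'en'}
```

Whether spynner's `set_cookies` wants this dict, a raw string, or a Mozilla-format cookie jar is exactly what I'm unsure about.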
EthanZhang
