My CrawlSpider:
from scrapy.contrib.spiders import CrawlSpider, Rule
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor


class FabulousFoxSpider(CrawlSpider):
    """docstring for FabulousFoxSpider"""
    name = "fabulousfox"
    allowed_domains = ["fabulousfox.com"]
    start_urls = ["http://www.fabulousfox.com"]

    # Follow links to the show pages and hand them to parse_fabulousfox.
    rules = (
        Rule(SgmlLinkExtractor(
            allow=(
                '/shows_page_(single|multi).aspx\?usID=(\d)*'
            ),
            unique=True),
            'parse_fabulousfox',
        ),
    )
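As a sanity check, the allow pattern itself does seem to match the kind of show URLs I expect. Testing it on its own in a Python 2 shell, with URLs taken from the crawl log below:

import re

pattern = '/shows_page_(single|multi).aspx\?usID=(\d)*'

# Both of these URLs come straight from the crawl log below and should be followed.
print re.search(pattern, 'http://www.fabulousfox.com/shows_page_multi.aspx?usID=365')
print re.search(pattern, 'http://www.fabulousfox.com/shows_page_single.aspx?usID=389')
# Each prints a match object, so the pattern itself looks fine.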
But when I run scrapy crawl fabulousfox -o data.json -t json, I get the following output:
...................
......................
2014-03-01 13:11:56+0530 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
2014-03-01 13:11:56+0530 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
2014-03-01 13:11:57+0530 [fabulousfox] DEBUG: Crawled (200) <GET http://www.fabulousfox.com> (referer: None)
2014-03-01 13:11:57+0530 [fabulousfox] DEBUG: Crawled (403) <GET http://www.fabulousfox.com/../shows_page_multi.aspx?usID=365> (referer: http://www.fabulousfox.com)
2014-03-01 13:11:58+0530 [fabulousfox] DEBUG: Crawled (403) <GET http://www.fabulousfox.com/../shows_page_single.aspx?usID=389> (referer: http://www.fabulousfox.com)
2014-03-01 13:11:58+0530 [fabulousfox] DEBUG: Crawled (403) <GET http://www.fabulousfox.com/../shows_page_multi.aspx?usID=388> (referer: http://www.fabulousfox.com)
2014-03-01 13:11:58+0530 [fabulousfox] DEBUG: Crawled (403) <GET http://www.fabulousfox.com/../shows_page_single.aspx?usID=394> (referer: http://www.fabulousfox.com)
2014-03-01 13:11:58+0530 [fabulousfox] DEBUG: Crawled (403) <GET http://www.fabulousfox.com/../shows_page_multi.aspx?usID=358> (referer: http://www.fabulousfox.com)
2014-03-01 13:11:58+0530 [fabulousfox] INFO: Closing spider (finished)
2014-03-01 13:11:58+0530 [fabulousfox] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 1660,
'downloader/request_count': 6,
'downloader/request_method_count/GET': 6,
'downloader/response_bytes': 12840,
'downloader/response_count': 6,
'downloader/response_status_count/200': 1,
'downloader/response_status_count/403': 5,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2014, 3, 1, 7, 41, 58, 218296),
'log_count/DEBUG': 8,
'log_count/INFO': 7,
'memdebug/gc_garbage_count': 0,
'memdebug/live_refs/FabulousFoxSpider': 1,
'memusage/max': 33275904,
'memusage/startup': 33275904,
'request_depth_max': 1,
'response_received_count': 6,
'scheduler/dequeued': 6,
'scheduler/dequeued/memory': 6,
'scheduler/enqueued': 6,
'scheduler/enqueued/memory': 6,
'start_time': datetime.datetime(2014, 3, 1, 7, 41, 56, 360266)}
2014-03-01 13:11:58+0530 [fabulousfox] INFO: Spider closed (finished)
Why do the generated URLs contain .., as in
http://www.fabulousfox.com/../shows_page_multi.aspx?usID=365
Also, it isn't generating all of the URLs. What's wrong here?
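Do I have to normalize these URLs myself before the requests are scheduled? This is a rough, untested sketch of what I had in mind, using the Rule's process_links hook (clean_links is just a name I made up):

def clean_links(links):
    # Strip the stray '/../' segment from each extracted link before
    # CrawlSpider turns it into a request.
    for link in links:
        link.url = link.url.replace('/../', '/')
    return links

The rule would then be built with process_links=clean_links. But I'd rather understand why the bad URLs are being extracted in the first place than paper over them like this.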