
I'm using Scrapy to parse a website.

I can generate a list of the links I'm interested in on the page using this code:

response.xpath('//a[@class="button-border"]/@href').extract()

However, it returns a list of relative links. How can I create the absolute links, crawl all of them, and apply another set of rules to every link?
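
A minimal sketch of one way to do this, assuming the XPath above runs inside a spider's `parse()` method; the spider name, start URL, and `parse_item` callback are placeholders rather than anything from the question. `response.urljoin()` resolves each relative href against the current page's URL, and yielding one `scrapy.Request` per link lets a second callback apply the other set of rules:

```
import scrapy


class ButtonSpider(scrapy.Spider):
    # Placeholder name and start URL -- replace with your own.
    name = "buttons"
    start_urls = ["http://example.com/"]

    def parse(self, response):
        # Extract the relative hrefs, as in the question.
        for href in response.xpath('//a[@class="button-border"]/@href').extract():
            # Resolve the relative href against the URL of the current
            # response to get an absolute link.
            absolute_url = response.urljoin(href)
            # Schedule a crawl of each link; parse_item is the callback
            # that applies the second set of rules.
            yield scrapy.Request(absolute_url, callback=self.parse_item)

    def parse_item(self, response):
        # Apply whatever other rules you need to each followed page.
        yield {
            "url": response.url,
            "title": response.xpath("//title/text()").extract_first(),
        }
```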

  • I read that thread before posting the question, but I can't join the URL and then pass a method for each URL. – GGA Jan 23 '16 at 21:02
  • Use a list comprehension: `[urljoin(base_url, url) for url in response.xpath('//a[@class="button-border"]/@href').extract()]` where `base_url = response.url`. – alecxe Jan 23 '16 at 21:05
  • It gives `TypeError: unhashable type: 'list'`. – GGA Jan 23 '16 at 21:14
  • `.extract()` returns a list, please debug your code first. – eLRuLL Jan 23 '16 at 23:39
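
Putting alecxe's comment together into a short sketch, assuming Python 3 (on Python 2 the import would be `from urlparse import urljoin`). The `unhashable type: 'list'` error is consistent with passing the whole extracted list to `urljoin` instead of one href at a time, which the comprehension avoids:

```
from urllib.parse import urljoin


def absolute_links(response):
    # Resolve each relative href against the page URL, one string at a time.
    base_url = response.url
    relative = response.xpath('//a[@class="button-border"]/@href').extract()
    return [urljoin(base_url, url) for url in relative]
```

Each resulting absolute URL can then be wrapped in a `scrapy.Request` with a second callback, as in the sketch under the question.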
