I want to crawl a website that only accepts POST requests. I want to send the query parameters as POST data in every request. How can I achieve this?
POST requests can be made using Scrapy's `Request` or `FormRequest` classes. Also, consider using the `start_requests()` method instead of the `start_urls` attribute.
Example:

```python
from scrapy import Spider
from scrapy.http import FormRequest

class myspiderSpider(Spider):
    name = "myspider"
    allowed_domains = ["www.example.com"]

    def start_requests(self):
        return [FormRequest("http://www.example.com/login",
                            formdata={'someparam': 'foo', 'otherparam': 'bar'},
                            callback=self.parse)]
```
Hope that helps.

alecxe
- But which parameter of `Request` or `FormRequest` should I pass the post data to? `body` or `formdata`? – Friedmannn Mar 20 '14 at 03:42
- You should use the `formdata` dict for the post data. – lgaggini May 07 '14 at 14:50
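To illustrate the difference the comments above are getting at: `FormRequest`'s `formdata` parameter takes a dict and URL-encodes it into the request body for you (also setting the `application/x-www-form-urlencoded` content type), whereas a plain `Request` with `body` expects an already-encoded string. A minimal sketch of that encoding step, using only the standard library rather than Scrapy itself:

```python
from urllib.parse import urlencode

# Sketch of the encoding FormRequest performs when given formdata:
# the dict is URL-encoded and sent as the POST body.
params = {'someparam': 'foo', 'otherparam': 'bar'}
body = urlencode(params)
print(body)  # someparam=foo&otherparam=bar
```

If you use a bare `Request`, you would have to build such a string yourself and pass it as `body`; with `formdata`, Scrapy does it for you.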
- How do you do it with a list of urls? – CodeGuru Sep 17 '17 at 13:34