I am new to Python and Scrapy. I have been trying to scrape a website that uses URL fragments, so I am making a POST request to get the response, but unfortunately it is not returning the result:
def start_requests(self):
    try:
        form = {
            'menu': '6',
            'browseby': '8',
            'sortby': '2',
            'media': '3',
            'ce_id': '1428',
            'ot_id': '19999',
            'marker': '354',
            'getpage': '1',
        }
        head = {
            'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
            # 'Content-Length': '78',
            # 'Host': 'onlinelibrary.ectrims-congress.eu',
            # 'Accept-Encoding': 'gzip, deflate, br',
            # 'Connection': 'keep-alive',
            'XMLHttpRequest': 'XMLHttpRequest',
        }
        urls = [
            'https://onlinelibrary.ectrims-congress.eu/ectrims/listing/conferences'
        ]
        request_body = urllib.parse.urlencode(form)
        print(request_body)
        print(type(request_body))
        for url in urls:
            req = Request(url=url, body=request_body, method='POST',
                          headers=head, callback=self.parse)
            req.headers['Cookie'] = 'js_enabled=true; is_cookie_active=true;'
            yield req
    except Exception as e:
        print('the error is {}'.format(e))
I am getting a constant error:
[scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <POST https://onlinelibrary.ectrims-congress.eu/ectrims/listing/conferences> (failed 4 times): 400 Bad Request
When I tried the same request in Postman, I got the expected output. Can somebody help me with this?
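As a sanity check (plain stdlib, no Scrapy needed), the form dict above serializes to a 78-byte body, which matches the commented-out 'Content-Length: 78' header, so the payload itself appears correct and the 400 more likely comes from the headers or cookies:

```python
import urllib.parse

# Same form dict as in the spider above.
form = {
    'menu': '6', 'browseby': '8', 'sortby': '2', 'media': '3',
    'ce_id': '1428', 'ot_id': '19999', 'marker': '354', 'getpage': '1',
}

# Dicts preserve insertion order in Python 3.7+, so this output is deterministic.
body = urllib.parse.urlencode(form)
print(body)       # menu=6&browseby=8&sortby=2&media=3&ce_id=1428&ot_id=19999&marker=354&getpage=1
print(len(body))  # 78 -- matches the commented-out Content-Length header
```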