I'm using Scrapy to crawl data. After crawling for a short period, the following memory error occurs:
OSError: [Errno 12] Cannot allocate memory
What could be the reason for it?

cici alex
Possible duplicate of [Python subprocess.Popen "OSError: \[Errno 12\] Cannot allocate memory"](http://stackoverflow.com/questions/1367373/python-subprocess-popen-oserror-errno-12-cannot-allocate-memory) – S.Spieker Mar 16 '16 at 07:52
2 Answers
This may be occurring because you are running out of memory. Try increasing swap space using the following commands.
sudo dd if=/dev/zero of=/swapfile bs=1024 count=1024k   # create a 1 GiB file of zeros (1024k blocks of 1024 bytes)
sudo mkswap /swapfile                                   # format the file as swap space
sudo swapon /swapfile                                   # enable it for the running system
Then open fstab
sudo nano /etc/fstab
Then add the following line to make the swap change permanent.
/swapfile none swap sw 0 0
Reference link

qwertyui90
Could you show us your spider, please? It's possible that restructuring your spider would use less memory. For example, reading an entire CSV file at the start of the scrape can chew up a lot of memory, and so can queuing a lot of `Request`s at the start of the run. – Steve Mar 16 '16 at 09:10
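A minimal sketch of the restructuring Steve describes, assuming the URLs come from a CSV file (the file name `urls.csv`, the spider name, and the column layout are all hypothetical): stream the file with a generator in `start_requests()` so rows are read and `Request`s queued incrementally, instead of loading everything up front.

```python
import csv

import scrapy


class LazyCsvSpider(scrapy.Spider):
    name = "lazy_csv"  # hypothetical spider name

    def start_requests(self):
        # The file object is a lazy iterator: rows are read one at a time,
        # and Scrapy consumes this generator only as fast as its scheduler
        # needs new requests, so the CSV never sits fully in memory.
        with open("urls.csv", newline="") as f:  # hypothetical input file
            for row in csv.reader(f):
                yield scrapy.Request(row[0], callback=self.parse)

    def parse(self, response):
        # Illustrative extraction only.
        yield {"url": response.url, "title": response.css("title::text").get()}
```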
You can also specify how much memory Scrapy may use in the project's settings.py file:
MEMUSAGE_ENABLED = True
MEMUSAGE_LIMIT_MB = 1024
Note that MEMUSAGE_LIMIT_MB defaults to 0, which means no limit is enforced.
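A slightly fuller settings.py sketch, adding the related warning threshold (the MB values and the email address are illustrative assumptions, not recommendations). The memory usage extension relies on the stdlib `resource` module, so it only works on POSIX systems:

```python
# settings.py -- Scrapy memory usage extension (POSIX only)
MEMUSAGE_ENABLED = True                      # turn the extension on
MEMUSAGE_LIMIT_MB = 1024                     # close the spider above 1 GiB
MEMUSAGE_WARNING_MB = 800                    # log a warning at this threshold first
MEMUSAGE_NOTIFY_MAIL = ["you@example.com"]   # optional: emails to notify when the limit is reached
```

Crossing MEMUSAGE_WARNING_MB only logs a warning; crossing MEMUSAGE_LIMIT_MB shuts the spider down.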

eLRuLL