I have a Scrapy spider written in Python 3 that I want to run as a cron job on my cloud Linux server (I have root access).
First, I couldn't even install it: pip3 install scrapy fails with the following error:
Exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/tarfile.py", line 1642, in bz2open
    import bz2
  File "/usr/local/lib/python3.4/bz2.py", line 20, in <module>
    from _bz2 import BZ2Compressor, BZ2Decompressor
ImportError: No module named '_bz2'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.4/site-packages/pip/basecommand.py", line 122, in main
    status = self.run(options, args)
  File "/usr/local/lib/python3.4/site-packages/pip/commands/install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "/usr/local/lib/python3.4/site-packages/pip/req.py", line 1197, in prepare_files
    do_download,
  File "/usr/local/lib/python3.4/site-packages/pip/req.py", line 1375, in unpack_url
    self.session,
  File "/usr/local/lib/python3.4/site-packages/pip/download.py", line 582, in unpack_http_url
    unpack_file(temp_location, location, content_type, link)
  File "/usr/local/lib/python3.4/site-packages/pip/util.py", line 625, in unpack_file
    untar_file(filename, location)
  File "/usr/local/lib/python3.4/site-packages/pip/util.py", line 543, in untar_file
    tar = tarfile.open(filename, mode)
  File "/usr/local/lib/python3.4/tarfile.py", line 1567, in open
    return func(name, filemode, fileobj, **kwargs)
  File "/usr/local/lib/python3.4/tarfile.py", line 1644, in bz2open
    raise CompressionError("bz2 module is not available")
tarfile.CompressionError: bz2 module is not available
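As far as I can tell, the underlying problem is that this Python 3.4 build is missing the _bz2 extension module, so pip can't unpack .tar.bz2 archives. I can reproduce it outside of pip with a small check run under the same python3 interpreter that pip3 uses (nothing Scrapy-specific here):

# Quick check: does this interpreter have a working bz2 module?
# On my server this prints the ImportError, which I assume means Python
# was compiled without the bz2 development headers available.
try:
    import bz2
    print("bz2 is available:", bz2.__file__)
except ImportError as exc:
    print("bz2 is missing:", exc)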
Second, once the installation works, how can I run the spider as a cron job?
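For reference, I'd also be fine with pointing cron at a small wrapper script instead of the scrapy command, if that is the cleaner way to do it. Here is a rough sketch of what I have in mind, assuming a standard Scrapy project layout and a spider named myspider (both placeholders, not my real names):

#!/usr/bin/env python3
# run_spider.py -- sketch of a wrapper that cron could call directly.
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def main():
    # get_project_settings() expects to be run from inside the Scrapy project
    # (where scrapy.cfg lives), so the cron job would have to cd there first.
    process = CrawlerProcess(get_project_settings())
    process.crawl("myspider")  # look up the spider by name
    process.start()            # blocks until the crawl is finished

if __name__ == "__main__":
    main()

The idea is that the crontab entry would just change into the project directory and run this script with the full path to python3, so cron's minimal environment is not an issue. Is that a reasonable approach, or is there a better way?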