I am able to run a crawler locally that reads some input from a local file inside the Scrapy project. Deployment with scrapyd-deploy fails, because the local file is apparently not included in the package.
Inside the Scrapy project, the spider opens the file like this:

import csv

with open('imports/filter.csv', newline='') as f:
    for row in csv.reader(f):
        ...  # process each filter row
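As far as I can tell, a relative path like this is resolved against the process's current working directory, not the project directory, which may explain why it only works when I start the crawler from the project root. A minimal sketch of anchoring the path to the module instead (assuming imports/ sits next to this module inside the package):

import csv
from pathlib import Path

# Resolve the CSV relative to this file's directory instead of the CWD.
csv_path = Path(__file__).resolve().parent / 'imports' / 'filter.csv'

with csv_path.open(newline='') as f:
    for row in csv.reader(f):
        ...  # process each filter row

But I suspect this still breaks once the project is shipped as a zipped egg, which is what scrapyd-deploy builds, so it does not feel like the real fix.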
setup.py:
from setuptools import setup, find_packages

setup(
    name='test',
    version='1.0',
    packages=find_packages(),
    entry_points={'scrapy': ['settings = crawler.settings']},
    include_package_data=True,
    package_data={'': ['imports/*.csv']},
)
The package_data and include_package_data settings somehow have no effect; the deployed spider fails with:
in GetbidSpider
FileNotFoundError: [Errno 2] No such file or directory: 'imports/filter.csv'
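My current guess (unverified) is that package_data patterns are resolved relative to each package directory, so the '' key only matches files that live inside a package found by find_packages(), i.e. a directory with an __init__.py. If imports/ were moved inside the crawler package, the setup.py might look like this:

from setuptools import setup, find_packages

setup(
    name='test',
    version='1.0',
    packages=find_packages(),
    entry_points={'scrapy': ['settings = crawler.settings']},
    include_package_data=True,
    # Patterns are relative to the named package, so this would
    # match crawler/imports/*.csv.
    package_data={'crawler': ['imports/*.csv']},
)

But even if the CSV ends up inside the egg, I am not sure open() with a relative path can find it there.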
How can I include files inside the project without using absolute paths?
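One approach I am considering is reading the file through pkgutil, which as far as I understand can load a resource even from a zipped egg (a sketch, assuming the CSV is packaged under the crawler package):

import csv
import io
import pkgutil

# get_data() asks the package's loader for the resource, so it should
# work whether the package is a plain directory or a zipped egg.
raw = pkgutil.get_data('crawler', 'imports/filter.csv')
for row in csv.reader(io.StringIO(raw.decode('utf-8'))):
    ...  # process each filter row

Is that the recommended way, or is there a cleaner mechanism?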