On AWS Elastic Beanstalk, I have some Python packages installed in a directory that is not on the standard Python packages path (for either the 2.6.x or the 2.7.x version used by the Elastic Beanstalk environment). As a result, these packages are not (by default) visible to the AWS-EB deployment process when it installs the packages listed in requirements.txt, which can result in redundant packages being installed, often at the cost of (very) long deployment times.
Is there a way to make the directory where my packages are installed visible to the deployment process?
Conceptually, since (I assume) requirements.txt processing occurs in my application's virtual environment (does it?), I could run
echo 'export PYTHONPATH="/anaconda/lib/python2.7/site-packages"' >> /opt/python/run/venv/bin/activate
at some stage before requirements.txt is processed, and (for tidiness) run
sed -i '/^export PYTHONPATH/d' /opt/python/run/venv/bin/activate
when it deactivates. But it isn't clear to me that this would happen at the right point in deployment, and in any case it doesn't work because of permissions issues: I'm denied when I try it over eb ssh, and as container_commands these commands have no effect. Perhaps something like this is on the right track, though; are there places where I could "hook" similar commands? (In any case, it illustrates roughly what I'm trying to do.)
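Taken together, the two commands above amount to the following round-trip. Since I can't write to /opt/python/run/venv/bin/activate myself, this sketch runs against a temporary file standing in for the real activate script, just to show the intent:

```shell
# Stand-in for /opt/python/run/venv/bin/activate on the instance.
activate=$(mktemp)
echo 'PATH="$VIRTUAL_ENV/bin:$PATH"' > "$activate"  # placeholder content

# Before requirements.txt is processed: expose the extra packages.
echo 'export PYTHONPATH="/anaconda/lib/python2.7/site-packages"' >> "$activate"

# After deployment, for tidiness: remove the added line again.
sed -i '/^export PYTHONPATH/d' "$activate"

# The file is back to its original content.
cat "$activate"
```

The open question is where in the deployment lifecycle these two steps could be hooked so that the export is in effect exactly while pip processes requirements.txt.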