I am working on a project for which I have cloned a repository from GitHub.
After the first compile I realized that the cloned project has some dependencies, listed in a `requirements.txt` file.
I know I have to install these packages, but I don't want to, because I am on a Windows development environment, and after finishing my project I am going to publish it to my Ubuntu production environment; I don't want the hassle of a double installation.
I have two options:

1. Using a `virtualenv` and installing those packages inside it
2. Downloading the packages and using them the direct way with `import foldername`
I want to avoid the first option because I have less control over my project, and the problem gets bigger and bigger if, for example, I am inside another project's virtualenv and want to run my project's `main.py` from its own virtualenv, and so on. Also, moving the virtualenv from Windows (bat files) to Linux (bash/sh files) seems ugly to me and pushes me toward approaches I would rather avoid.
The second option is my choice. For example, I need to use the `future` package. The scenario would be: download the package using `pip download future`, and when that is done, extract the `tar.gz` file; inside the `src` folder I can see the `future` package folder, and I use it with `import future_package.src.future` without even touching anything else.
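Taking care of the path side looks roughly like this; a minimal sketch, assuming the extracted tarball sits in a `vendor` folder next to my script (`vendor` is just a name I picked, nothing the package requires):

```python
import os
import sys

# Hypothetical layout: extracted "pip download" tarballs live in a
# vendor/ folder next to this script, e.g. vendor/future_package/src/future
HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, "vendor", "future_package", "src"))

# With the src folder on sys.path, the plain import resolves to the
# vendored copy, as if the package had been pip-installed.
import future
```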
Aside from the `os.path` problems (which, as sketched above, I assume I take care of):

1. Is this good practice?
2. I am not running `setup.py`, so nothing actually gets installed. Can that cause problems?
3. Is there any better approach that involves less work (like my second option), or is the first approach I mentioned actually the better one?
UPDATE 1: I have extracted the `future` and `certifi` packages, which were part of my project's requirements, and used them the direct way, and it is working in this particular case.