
I generally don't bother to install Python modules. I use web2py, so I just dump them in the modules folder and let it take care of the local imports. That always seemed like the most straightforward way of doing things: handling dependencies at the system-wide level never felt right, and I never wanted to mess with virtual envs.

On one of my other questions, the answerer said:

> Generally, the best practice for 3rd party modules is to install them via pip or easy_install (preferably in a virtualenv), if they're available on PyPI, rather than copying them somewhere onto your PYTHONPATH. ... [because that] runs the install scripts hooks necessary to install executable scripts, build C extensions, etc., that isn't done by just copying in a module.

I don't fully understand this. I always thought it was more a matter of preference, but is it really better practice to install 3rd-party modules rather than copy them in, and am I potentially causing problems by not doing so? Does using a framework like web2py make a difference?

Yarin
  • Like the answerer said, C extensions won't get built, and any executable scripts won't be installed. Some modules also rely on `setup.py` to build certain templated source files, though that's less common. It will typically, but not always, be immediately obvious if it breaks. Virtualenvs really aren't that much work to handle, though, and are the "right way" to handle this. – Danica May 25 '12 at 21:25
  • How can you know if a module builds C extensions? – Yarin May 25 '12 at 21:27
  • 1
    I'm not sure if there's a great, generic way. You can look in the `setup.py` for `Extension` modules, or search for files with extensions other than `.py` and see if any are `.c`, `.cpp`, `.cxx`, `.C`, `.h`, or any of the myriad of other extensions that might be used in an extension file. Most general Python packages you're likely to use in a web project probably won't have one, but anything that's computationally intensive might, as would things based on interfacing to preexisting libraries. – Danica May 25 '12 at 21:34
  • Dougal, thanks; sounds like the safe play is more use of virtualenvs. I'll get on it. – Yarin May 25 '12 at 21:40
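Danica's heuristic above can be sketched as a small helper. The suffix list and the `Extension(`/`ext_modules` patterns are my assumptions about what to look for; this is a rough check, not a guarantee either way:

```python
import re
from pathlib import Path

# Suffixes that usually indicate native-extension sources or build artifacts.
# This set is an assumption, not an exhaustive list.
NATIVE_SUFFIXES = {".c", ".cpp", ".cxx", ".h", ".pyx", ".so", ".pyd"}

def may_build_c_extensions(package_dir):
    """Heuristic: does this source tree look like it builds C extensions?"""
    root = Path(package_dir)
    setup_py = root / "setup.py"
    if setup_py.exists():
        text = setup_py.read_text(errors="ignore")
        # distutils/setuptools C extensions show up as Extension(...) entries
        # passed via the ext_modules argument to setup().
        if re.search(r"\bExtension\s*\(", text) or "ext_modules" in text:
            return True
    # Fall back to scanning the tree for native source files.
    return any(p.suffix in NATIVE_SUFFIXES for p in root.rglob("*"))
```

If this returns True, copying the package into a modules folder will likely leave it broken or slow (pure-Python fallbacks), and a real install is the safer route.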

1 Answer


It depends on the module and what you want to use it for. Some packages come with useful command line tools, which may only be available to you if you install them appropriately.

Conversely, if you're writing code which is to be distributed to environments you don't have much control over, you often have to keep a copy of the code locally within your project, as the target environment may not have the package... web projects often fall into this category, depending on your serving environment, of course.
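That local-copy approach boils down to putting a vendored directory on the import path, which is roughly what web2py does for its modules folder per the question. A minimal sketch, where the `modules` directory name is just an example layout:

```python
import sys
from pathlib import Path

def add_vendor_dir(project_root, name="modules"):
    """Make pure-Python modules copied into <project_root>/<name> importable.

    This is the "dump it in a folder" approach from the question: it works
    for pure-Python packages, but it won't build C extensions or install
    any executable scripts the package ships with.
    """
    vendor = str(Path(project_root) / name)
    if vendor not in sys.path:
        # Insert at the front so vendored copies win over system-wide ones.
        sys.path.insert(0, vendor)
    return vendor
```

This keeps deployments self-contained at the cost of doing none of the install-time work pip would do.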

Stu Cox
  • Of course, a virtualenv with a custom bootstrap script (write a requirements.txt and feed it to the virtualenv's pip using `subprocess` in the `after_install` hook) solves the same problem more reliably and has all the nice things the right way (tm) offers: automation, dependency handling, easy upgrading, fewer manual errors. It needs some network access, but that seems practically a given in web dev ;) – May 25 '12 at 21:32
  • 2
    Sandboxed environments like Google App Engine or some shared hosting often won't let you run things like a virtualenv bootstrap script. I agree it's a great way to work if you can though! – Stu Cox May 25 '12 at 21:37
  • Thanks, guys; looks like I need to bone up on virtualenvs. – Yarin May 25 '12 at 21:49
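The bootstrap idea from the first comment can be sketched with the stdlib `venv` module instead of virtualenv's old `after_install` hook. The paths and the requirements filename here are assumptions about your layout:

```python
import subprocess
import sys
from pathlib import Path

def bootstrap_env(env_dir, requirements="requirements.txt"):
    """Create a virtual environment and install pinned requirements into it."""
    # Create the environment (stdlib venv; virtualenv would work the same way).
    subprocess.check_call([sys.executable, "-m", "venv", str(env_dir)])
    # Locate the environment's own pip executable.
    bindir = "Scripts" if sys.platform == "win32" else "bin"
    pip = Path(env_dir) / bindir / "pip"
    # Install everything listed in the requirements file into the environment.
    subprocess.check_call([str(pip), "install", "-r", str(requirements)])
    return pip
```

Unlike copying modules around, this runs each package's install machinery, so C extensions get built and console scripts get installed.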