Django and VirtualEnv Development/Deployment Best Practices

Question:

Just curious how people are deploying their Django projects in combination with virtualenv

  • More specifically, how do you keep your production virtualenvs correctly synced with your development machine?

I use git for SCM, but I don’t have my virtualenv inside the git repo. Should I? Or is it best to use pip freeze and then re-create the environment on the server from the freeze output? (If you do this, could you please describe the steps? I am finding very little good documentation on the unfreezing process. Is something like pip install -r freeze_output.txt possible?)
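For reference, the freeze/unfreeze round trip works the way the question guesses. A minimal sketch in a scratch directory, using python3 -m venv as a stand-in for the standalone virtualenv command of the era (directory names here are illustrative):

```shell
set -e
cd "$(mktemp -d)"

# "Development" side: create an environment and freeze its package list.
python3 -m venv dev-env
dev-env/bin/pip freeze > frozen.txt

# "Server" side: build a fresh environment and replay the list.
# This is the unfreezing step: pip install -r frozen.txt
python3 -m venv prod-env
prod-env/bin/pip install -r frozen.txt
```

In a real deployment you would check frozen.txt (conventionally named requirements.txt) into version control rather than copying it by hand.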

Asked By: edub


Answers:

I use this bootstrap.py: http://github.com/ccnmtl/ccnmtldjango/blob/master/ccnmtldjango/template/bootstrap.py

which expects a directory called ‘requirements’ that looks something like this: http://github.com/ccnmtl/ccnmtldjango/tree/master/ccnmtldjango/template/requirements/

There’s an apps.txt, a libs.txt (which apps.txt includes; I just like to keep Django apps separate from other Python modules) and a src directory which contains the actual tarballs.

When ./bootstrap.py is run, it creates the virtualenv (wiping a previous one if it exists) and installs everything from requirements/apps.txt into it. I never install anything into the virtualenv any other way. If I want to include a new library, I put the tarball into requirements/src/, add a line to one of the text files, and re-run ./bootstrap.py.

bootstrap.py and the requirements directory get checked into version control (along with a copy of pip.py, so I don’t even have to have that installed system-wide anywhere). The virtualenv itself isn’t. The scripts that push out to production run ./bootstrap.py on the production server each time I push. bootstrap.py also goes to some lengths to ensure that it sticks to Python 2.5, since that’s what we have on the production servers (Ubuntu Hardy), and my dev machine (Ubuntu Karmic) defaults to Python 2.6 if you’re not careful.
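The wipe-and-rebuild cycle described above can be sketched as a small shell script. The real bootstrap.py is linked above; everything here is a stand-in (an empty requirements file, python3 -m venv instead of a pinned Python 2.5 interpreter, illustrative directory names):

```shell
set -e
cd "$(mktemp -d)"            # stand-in for the project root

# Stand-ins for the checked-in requirements layout described above.
mkdir -p requirements/src
: > requirements/apps.txt    # the real file lists tarballs kept in requirements/src/

ENV_DIR=ve
rm -rf "$ENV_DIR"            # wipe any previous virtualenv
python3 -m venv "$ENV_DIR"   # the original pins the interpreter to Python 2.5 here
"$ENV_DIR/bin/pip" install -r requirements/apps.txt
```

Because the environment is destroyed and rebuilt from the checked-in package list every time, development and production can only drift if the requirements files themselves drift.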

Answered By: thraxil

I just set something like this up at work using pip, Fabric and git. The flow is basically like this, and borrows heavily from this script:

  1. In our source tree, we maintain a requirements.txt file. We’ll maintain this manually.
  2. When we do a new release, the Fabric script creates an archive based on whatever treeish we pass it.
  3. Fabric will find the SHA for what we’re deploying with git log -1 --format=format:%h TREEISH. That gives us SHA_OF_THE_RELEASE
  4. Fabric will get the last SHA for our requirements file with git log -1 --format=format:%h SHA_OF_THE_RELEASE requirements.txt. This spits out the short version of the hash, like 1d02afc, which is the SHA of that file for this particular release.
  5. The Fabric script will then look into a directory where our virtualenvs are stored on the remote host(s).
    1. If there is no directory named 1d02afc, a new virtualenv is created and set up with pip install -E /path/to/venv/1d02afc -r /path/to/requirements.txt
    2. If /path/to/venv/1d02afc already exists, nothing is done
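Steps 3–5 can be sketched as follows, using a throwaway repo so the git log calls have something to operate on. Directory names are illustrative, and the virtualenv creation is only echoed. (Note also that pip’s old -E flag has since been removed; the modern equivalent is running the virtualenv’s own pip, as shown in the echoed command.)

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "Django==1.1" > requirements.txt    # illustrative pin
git add requirements.txt
git commit -qm "pin requirements"

# Step 3: short SHA of the release tree-ish itself.
RELEASE_SHA=$(git log -1 --format=format:%h HEAD)

# Step 4: short SHA of the last commit that touched requirements.txt.
REQS_SHA=$(git log -1 --format=format:%h "$RELEASE_SHA" -- requirements.txt)

# Step 5: build a virtualenv keyed on that SHA, but only if it is missing.
VENV_DIR="$repo/venvs/$REQS_SHA"
if [ ! -d "$VENV_DIR" ]; then
    echo "would run: virtualenv $VENV_DIR && $VENV_DIR/bin/pip install -r requirements.txt"
fi
```

Keying the directory name on the requirements file’s SHA is what lets unchanged releases share one environment.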

The little magic part of this is passing whatever tree-ish you want to git, and having it do the packaging (from Fabric). By using git archive my-branch, git archive 1d02afc or whatever else, I’m guaranteed to get the right packages installed on my remote machines.
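A minimal illustration of that git archive packaging step, again with a throwaway repo (in the real setup Fabric drives this, and the tree-ish is whatever you pass it):

```shell
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "print('hello')" > app.py
git add app.py
git commit -qm "initial"

# Package any tree-ish (branch, tag, or SHA) into a deployable tarball.
git archive --format=tar HEAD | gzip > release.tar.gz
```

Because the archive is produced from a specific tree-ish rather than the working copy, what lands on the remote machines is exactly what is in version control.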

I went this route since I really didn’t want extra virtualenvs floating around if the packages hadn’t changed between releases. I also don’t like the idea of having the actual packages I depend on in my own source tree.

Answered By: brianz