Edit: Updated build steps to avoid Jenkins thinking a failed build passed.
I recently started a new private repository for a project I'm working on, and as per previous posts, I created it on Bitbucket, which limits the number of users accessing private repositories rather than the number of private repositories themselves (à la GitHub). Having evaluated CircleCI's pricing and waited for TravisCI's private repository offering, I realised that as a spare-time developer I can't really justify the expense of either provider's private repository plans.
This left me looking for an automated continuous integration solution for my private Django projects, the obvious choice being Jenkins, the open source continuous integration server. The remaining question was: how do I get the convenience of push-activated builds and a fresh virtualenv per build (as per Travis-CI's GitHub integration), whilst still keeping my projects private?
The integration between Bitbucket and Jenkins has been covered by the Jenkins service page in the Atlassian docs, and by Felix Leong's blog. However, there are a few things to be aware of when configuring your Jenkins job. For Django-Jenkins integration I use the django-jenkins app, something I covered a while back.
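Under the hood, the Bitbucket service does little more than an HTTP POST to Jenkins' build-trigger endpoint on each push; the hostname, job name and token below are placeholders, so substitute your own:

```
http://jenkins.example.com/job/my-django-project/build?token=SECRET_TOKEN
```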
To achieve the fresh virtualenv per build, first you need to give Jenkins a set of build steps, which I've split into the following: creating and activating a per-build virtualenv (named after $BUILD_TAG so each build gets its own), then installing dependencies and running the tests:

virtualenv $BUILD_TAG
. $BUILD_TAG/bin/activate
pip install --use-mirrors -q -r requirements.txt
python manage.py jenkins
You can then do the final cleanup using the Post Build Script plugin, which I have detailed here.
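The cleanup itself is just removing the per-build virtualenv directory; a minimal sketch, assuming the virtualenv was created under $BUILD_TAG as above:

```shell
# Post-build step: remove the per-build virtualenv so the
# workspace doesn't fill up with one environment per build.
rm -rf "$BUILD_TAG"
```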
The $BUILD_TAG Jenkins environment variable is documented here. This creates a clean virtualenv and installs your dependencies into it for each build. Like TravisCI, however, this can make builds agonisingly slow, because a fresh copy of every dependency must be downloaded from a PyPI mirror even if nothing has changed in the dependencies since the last build. My first builds took in excess of 4 minutes, despite the tests themselves taking less than a second to complete, something I felt was a little unsustainable. What is needed is caching.
To solve this, we leverage the local caching built into pip, either via the --download-cache argument at installation time:
pip install --download-cache=/path/to/jenkins/pip-cache -r requirements.txt
or using the PIP_DOWNLOAD_CACHE environment variable, which I was led to by an answer on StackOverflow.
Using the EnvInject Plugin for Jenkins, you can add this environment variable to your Jenkins job configuration (with the cache path above).
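In the job's "Inject environment variables" section, the properties content is just a key=value line; the path here matches the one above, but point it anywhere your Jenkins user can write:

```
PIP_DOWNLOAD_CACHE=/path/to/jenkins/pip-cache
```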
Now the builds take very little time to complete, with a local cache of the PyPI packages kept between builds. This retains the clean build environment for each new job, whilst avoiding the cost of re-downloading every dependency on each build.
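Putting it all together, the final "Execute shell" build step looks something like this; a sketch, assuming the $BUILD_TAG virtualenv naming from earlier (with PIP_DOWNLOAD_CACHE injected, the --download-cache flag is redundant, but shown here for completeness):

```
# Fail the build as soon as any command fails, so Jenkins
# never marks a failed build as passed.
set -e

# Create and activate an isolated per-build environment.
virtualenv $BUILD_TAG
. $BUILD_TAG/bin/activate

# Install dependencies, hitting the local download cache first.
pip install --use-mirrors -q --download-cache=/path/to/jenkins/pip-cache -r requirements.txt

# Run the django-jenkins tasks; a non-zero exit fails the build.
python manage.py jenkins
```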