At Pegasus News, we run custom deploy scripts that use pip to read through a requirements.txt file and keep our virtualenvs up to date. We use quite a few 3rd-party Django apps that we pull from PyPi, along with several apps - both internal and 3rd-party - from Github and Bitbucket. I'm a huge fan of virtualenv and pip and I love managing our environments with these tools. When we need to add a new requirement, we simply add it to our requirements.txt and on the next deploy pip will grab it for us on each of our web nodes.
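Our actual requirements file isn't shown here, but a typical pip requirements.txt that mixes named PyPi packages with editable VCS checkouts looks something like this (the package names, URL, and revision below are illustrative, not our real list):

:::text
# plain packages, resolved by name against a PyPi-style index
Django==1.1.1
ybrowserauth

# editable checkout pinned to a revision, pulled straight from Github
-e git+git://github.com/example/some-app.git@a1b2c3d#egg=some-app

On each deploy, `pip install -r requirements.txt` inside the virtualenv brings every node up to date with this list.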
We ran into a couple of issues along the way, however, and I found that a local PyPi server solved them quite nicely for us.
Stuck in Downtime Limbo
Before we started running a local PyPi server, when PyPi, Github, or Bitbucket went down we couldn't deploy - or, worse yet, we'd get stuck in the middle of a deploy and have to wait for the service to be restored before we could bring the site back out of maintenance mode. We were lucky that this only happened once and only left us hanging for about 10 minutes, but it could easily have been much worse. I knew we had to take control of our package sources so that we could be responsible for their availability.
Installing From Github, Bitbucket, and .tar.gz is Slow
Even when the services are up, installing requirements directly from Github or Bitbucket using the -e option is very slow. For each repository, pip does a full checkout from scratch for the requested revision. For large repos, this takes quite a few seconds, and when you have a large list of requirements being installed on multiple web nodes for multiple sites, those seconds add up.
When using plain .tar.gz files for the packages, hosted from our lighttpd server, pip would have to download each archive, decompress, and then use the setup script to gather version info. This added up quickly as well.
We Can't Upload Internal Packages to PyPi
Our internal Django apps are not open-source, so we can't upload them to the general PyPi index. In addition, we have a few custom packages made from snippets or examples that don't have a clear open-source license.
For packages hosted on PyPi, by contrast, pip was fast: it only had to fetch version info from the index and compare it against what was already installed, so it could scan a large list of requirements in no time. The answer, then, was to serve all of our packages - internal and external - from a PyPi-style index of our own.

Running a Local PyPi Server With Chishop

After looking at the available options, I decided to go with Chishop because it is built on Django. Its standard views list and serve package information, and its admin views let you manually administer the projects and packages on your server. Most of the time, however, you can simply interact with your local PyPi server through the setup.py files of the apps you work with.
Deploying With Gunicorn
I wanted to run Chishop on a completely separate process than our production Apache instance, but I didn't want to deal with all the overhead of running an additional, isolated Apache service. I've been hearing Django devs like Justin Lilly talk about their experiences with Gunicorn lately, so I decided to give it a try. After installing, I used the following Supervisord config block to get the process up and running:
:::text
[program:gunicorn]
command=/path/to/chishop/bin/django run_gunicorn 220.127.116.119:8123
directory=/path/to/chishop/chishop
user=myuser
autostart=true
autorestart=true
redirect_stderr=true
After having Supervisord reread its config, Gunicorn was off and running and my Chishop instance was live.
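To load a new program block without restarting Supervisord itself, the usual supervisorctl sequence is:

:::bash
$ supervisorctl reread          # pick up the new [program:gunicorn] section
$ supervisorctl update          # start any added or changed programs
$ supervisorctl status gunicorn # confirm the process is running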
Configuring My Local Machine
To configure my local machine to access our local PyPi server, I needed to create a ~/.pypirc file. If you don't already have a PyPi username and password, you can leave the 'user' and 'secret' placeholder values below as they are, and simply never try to upload anything to PyPi itself.
:::text
[distutils]
index-servers =
    chishop
    pypi

[pypi]
username:user
password:secret

[chishop]
username:user
password:secret
repository:http://my.chishop-server.com:8123
Packages uploaded to Chishop must have a functioning setup.py. Take a look at the one within Django itself for an example.
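As a sketch, a bare-bones setup.py for an internal app might look like the following - the name and metadata here are placeholders, not one of our real packages:

:::python
from setuptools import setup, find_packages

setup(
    name='example-internal-app',   # placeholder name
    version='0.1.0',
    description='An internal Django app served from a local PyPi server',
    packages=find_packages(),
)

Once a setup.py like this works locally, the register and upload commands below are all Chishop needs.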
Chishop is organized into projects, which collect the metadata for a specific package, and releases, which are specific versions of a package. To upload a package to Chishop, use the following command:
:::bash
$ python setup.py register -r chishop sdist upload -r chishop
This command registers the package metadata from setup.py with Chishop, then builds a source distribution and uploads it to the server. If the command succeeds, no further work is needed to make the package available. If something goes wrong, however, you may need to go into the Chishop admin section to manually create or alter the project and release.
To install a package from our Chishop server, we use pip with the -i option.
:::bash
$ pip install ybrowserauth -i http://my.chishop-server:8123
In our deploy script, we use an additional option, --extra-index-url=http://pypi.python.org/simple. This allows us to fall back to PyPi if the requested package isn't found on our Chishop instance.
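Putting the pieces together, the pip invocation in a deploy script looks roughly like this (the server name is the same placeholder used above):

:::bash
$ pip install -r requirements.txt \
    -i http://my.chishop-server:8123 \
    --extra-index-url=http://pypi.python.org/simple

Packages we've published to Chishop - internal apps and pinned snapshots alike - come from our own server, and anything else falls through to the public index.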
Using this setup, our deploys are now dramatically faster and more reliable. We have control over what packages are available, and we routinely create custom packages with specific Github revisions or modifications. I'd definitely recommend Chishop to others looking for a similar solution.