Deploying a Django Site with Fabric

There are a bunch of great articles out there to help people write a Fabric deployment script for their Django sites. The following has a couple of examples of how to use Fabric commands to write a git workflow helper, and a deployment script that pairs nicely with my Setting up an AWS ec2 instance for Nginx, Django, uWSGI, and MySQL article.

Get into your virtual environment

This article assumes that you are using a server setup similar to this, which means you are using virtualenv.

$ source path/to/your/venv/bin/activate

Install Fabric

See the full Fabric documentation for more details.

$ pip install fabric

and then update your requirements.txt file

$ pip freeze > requirements.txt

The Fabric script

Create a script called fabfile.py and put it somewhere in your project. I generally toss it in the root directory with the requirements.txt file.

In that file, import some of the more basic Fabric modules:

from fabric.api import *
from fabric.colors import green, red

#### A simple git workflow helper example

Next, I like to create a Fabric command to handle the most basic of git workflows. Add the following function to your fabfile.py.

def build_commit():
    """Build a commit"""
    local_branch = prompt("checkout branch: ")
    rebase_branch = prompt("rebase branch: ")

    local('git checkout %s' % local_branch)
    local('git add .')
    local('git add -u .')

    message = prompt("commit message: ")

    local('git commit -m "%s"' % message)
    local('git checkout %s' % rebase_branch)
    local('git pull origin %s' % rebase_branch)
    local('git checkout %s' % local_branch)
    local('git rebase %s' % rebase_branch)
    local('git checkout %s' % rebase_branch)
    local('git merge %s' % local_branch)
    local('git push origin %s' % rebase_branch)
    local('git checkout %s' % local_branch)

This assumes a simple git workflow, where you merge local feature branches into remote branches. Step by step, the task:

  1. prompts for the name of the local feature branch you are developing
  2. prompts for the name of the remote branch you want to push your changes to
  3. checks out your local feature branch
  4. stages new and modified files (git add .) as well as deletions of tracked files (git add -u .)
  5. prompts for a commit message
  6. runs git commit with that message
  7. checks out the local copy of the remote branch
  8. does a git pull to pull any changes from that remote branch
  9. checks out the local feature branch
  10. rebases the local feature branch onto the freshly updated local copy of the remote branch
  11. checks out the local copy of the remote branch
  12. merges the work from the rebased local feature branch into the local copy of the remote branch
  13. pushes the local copy of the remote branch to origin
  14. checks out the local feature branch so that work can continue

Running this task is as simple as:

$ fab build_commit
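If you would rather skip the prompts, Fabric 1 also lets you pass arguments on the command line with its `fab task:arg1,arg2` syntax. Here is a sketch of a variant of the task above written under that assumption; the branch names in the example invocation are hypothetical, and it assumes the same `from fabric.api import *` imports as the fabfile:

```python
# Variant of build_commit that accepts branch names as task arguments,
# falling back to prompts when they are omitted. Assumes prompt() and
# local() are in scope from fabric.api, as in the fabfile above.
def build_commit(local_branch=None, rebase_branch=None):
    """Build a commit; branches may be passed as fab task arguments."""
    if local_branch is None:
        local_branch = prompt("checkout branch: ")
    if rebase_branch is None:
        rebase_branch = prompt("rebase branch: ")

    local('git checkout %s' % local_branch)
    local('git add .')
    local('git add -u .')

    message = prompt("commit message: ")

    local('git commit -m "%s"' % message)
    local('git checkout %s' % rebase_branch)
    local('git pull origin %s' % rebase_branch)
    local('git checkout %s' % local_branch)
    local('git rebase %s' % rebase_branch)
    local('git checkout %s' % rebase_branch)
    local('git merge %s' % local_branch)
    local('git push origin %s' % rebase_branch)
    local('git checkout %s' % local_branch)
```

With this version, `$ fab build_commit:my-feature,master` would skip both prompts (the branch names here are just examples).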

#### Write a Simple Deploy Script

Next, we need to be able to get our code out to our server, run database migrations, collect static files, and restart the application server process. Create another Fabric task for this:

def server():
    """This pushes to the EC2 instance defined below"""
    # The Elastic IP to your server
    env.host_string = '999.999.999.999'
    # your user on that system
    env.user = 'ubuntu' 
    # Assumes that your *.pem key is in the same directory as your fabfile.py
    env.key_filename = 'my_ec2_security_group.pem'

The above function handles the connection information; you will chain it with an environment-specific deploy task in the next step.
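To see why that chaining works, here is a plain-Python stand-in for Fabric's shared env object (the Env class below is mine, not Fabric's): the first task on the command line mutates env, and the tasks that follow read from it.

```python
# Stand-in for fabric.api.env, just to illustrate how chained tasks share
# connection state. In a real fabfile, env is imported from fabric.api.
class Env(object):
    pass

env = Env()

def server():
    # same role as the server() task above
    env.host_string = '999.999.999.999'
    env.user = 'ubuntu'
    env.key_filename = 'my_ec2_security_group.pem'

def staging():
    # in real Fabric, run() would open an SSH connection using these values
    return "deploying to %s as %s" % (env.host_string, env.user)

server()        # fab runs tasks left to right: server first, then staging
print(staging())  # prints "deploying to 999.999.999.999 as ubuntu"
```

This is exactly what happens when you type `fab server staging`: Fabric calls each named task in order against the same global env.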

def staging():
    # path to the directory on the server where your vhost is set up
    path = "/home/ubuntu/path-to-project"
    # name of the application process
    process = "staging"

    print(red("Beginning Deploy:"))
    with cd("%s/app" % path):
        run("pwd")
        print(green("Pulling master from GitHub..."))
        run("git pull origin master")
        print(green("Installing requirements..."))
        run("source %s/venv/bin/activate && pip install -r requirements.txt" % path)
        print(green("Collecting static files..."))
        run("source %s/venv/bin/activate && python manage.py collectstatic --noinput" % path)
        print(green("Syncing the database..."))
        run("source %s/venv/bin/activate && python manage.py syncdb" % path)
        print(green("Migrating the database..."))
        run("source %s/venv/bin/activate && python manage.py migrate" % path)
        print(green("Restarting the uWSGI process..."))
        run("sudo service %s restart" % process)
    print(red("DONE!"))

The above task changes into the app directory, pulls the latest code, installs requirements, collects static files, syncs and migrates the database, and then restarts the web application process.

Run the command by typing the following into the terminal:

$ fab server staging

You could create another function called production that runs additional commands, uses a different vhost directory, and restarts a different service for your production environment. Or, if you had that same staging vhost set up on another server, you could write a server_b task with different connection information and then run $ fab server_b staging… Fabric is pretty flexible for these kinds of tasks.
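A production task along those lines might look like the following sketch. The path, branch, and service name are hypothetical, the body simply mirrors staging(), and it assumes the same fabfile imports (cd, run, green, red) are in scope:

```python
# Hypothetical production deploy task. Assumes cd, run, green, and red are
# in scope from the fabfile's imports, and that a separate uWSGI service
# named "production" exists on the server.
def production():
    path = "/home/ubuntu/path-to-production-project"  # hypothetical vhost path
    process = "production"                            # service to restart

    print(red("Beginning Deploy:"))
    with cd("%s/app" % path):
        print(green("Pulling master from GitHub..."))
        run("git pull origin master")
        print(green("Installing requirements..."))
        run("source %s/venv/bin/activate && pip install -r requirements.txt" % path)
        print(green("Collecting static files..."))
        run("source %s/venv/bin/activate && python manage.py collectstatic --noinput" % path)
        print(green("Syncing the database..."))
        run("source %s/venv/bin/activate && python manage.py syncdb" % path)
        print(green("Migrating the database..."))
        run("source %s/venv/bin/activate && python manage.py migrate" % path)
        print(green("Restarting the uWSGI process..."))
        run("sudo service %s restart" % process)
    print(red("DONE!"))
```

You would then deploy with `$ fab server production`, chaining it off the connection task just like staging.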

Things to note:

  • Each run call executes in its own shell session, with no knowledge of the previous step (which is why the virtualenv is activated on every line).
  • If any line fails, the script will stop.
  • The staging function uses run while build_commit uses local. local executes commands on your own machine; run executes them over SSH using the connection info configured by server. Running staging by itself will therefore fail; it MUST be chained off of the server command.
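If you ever do want a single step to be allowed to fail without aborting the whole deploy, Fabric 1 provides a settings context manager with warn_only=True. A minimal sketch, assuming `from fabric.api import run, settings` in the fabfile (the memcached service here is a hypothetical example):

```python
# Sketch: tolerate one failing step instead of aborting the whole deploy.
# Assumes run and settings are in scope from fabric.api.
def restart_memcached():
    """Restart memcached if present, but keep deploying if it is not."""
    with settings(warn_only=True):
        # with warn_only, a non-zero exit prints a warning instead of aborting
        result = run("sudo service memcached restart")
    if result.failed:
        print("memcached restart failed; continuing anyway")
```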
Written on March 9, 2013