Deployment is moving a website from a local environment to live servers. What seems like a simple thing can actually be quite complex. There are absolutely loads of ways to go about it, ranging from user-friendly software and services, to more complex command line tools, to full-blown systems with lots of moving parts.
I’m not an expert in all this, but I’ve set up a few different deployment systems in my day as well as worked with a number of different ones. I thought we could try and round up and break down as much as we can here.
If you edit files directly on your server, you essentially have no deployment step at all. You might accomplish this by using an FTP-enabled editor like Coda or Adobe Dreamweaver. Enough people do this that popular hosts like Media Temple even have documentation on setting it up.
You might even use something like WebDAV to mount your remote server like a hard drive on your local computer, and use whatever local editor you like. Transmit has a “Disks” feature that makes this very easy.
The appeal here is strong because you see results live so quickly. However, it is generally frowned upon, because:
- It’s dangerous: visitors see mistakes immediately
- There is no testing
- There is no record of changes – what/who/when
- You can’t revert changes
- You can overwrite other people’s work in teams
- You can’t work on large, long term projects without additional setup
One very home-grown deployment technique is to simply work locally and when you are ready, move the files to the live server with FTP software (like Transmit).
While this works, it can be tedious. Which files did I change? Oh, forget it, I’ll just move everything. Man this is slow.
This still has many of the same shortcomings as above. You still risk overwriting someone else’s work, you still can’t revert easily, etc.
Using FTP this way doesn’t automatically mean you aren’t using version control, but it’s likely you aren’t. If you were, you’d be managing it totally independently, and it would feel extra tedious. And still, nothing prevents you from making a mistake (e.g. sending files up over FTP before you pulled from the repo). If you are using version control, you’re likely already seeking a way to tie it into deployment.
Version control is important, and will play a role in every other deployment thing we talk about. But in itself, version control doesn’t automatically handle deployment for you. Without going into too much detail, projects are kept in repositories (or “repos”). They can have multiple contributors. Files are kept in sync. It is the contributors’ job to make sure they have the latest code and that their new code fits. There is a record of all changes.
Version control software is just that, software. It likely doesn’t come standard on servers that you buy (even the “fully loaded” servers that often come with stuff like PHP automatically). But it is also likely that you can install it on those servers.
Then, just like you can git push and git pull from your local machine, you’ll be able to SSH into the server and git pull from there, which will bring down the latest files from the repo onto the live server.
That’s simple deployment right there.
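To see the whole round trip, here is a runnable sketch of that flow. A temp directory stands in for the central repo and the live server; in real life the last steps happen over SSH, and the paths and names are all made up:

```shell
set -e
demo=$(mktemp -d) && cd "$demo"

# Your local working copy: an ordinary repo with a commit in it
mkdir site && cd site
git init
git config user.email "you@example.com" && git config user.name "You"
echo "<h1>Hello</h1>" > index.html
git add index.html && git commit -m "First commit"

# The central repo; in real life this lives on GitHub or your own server
git clone --bare . "$demo/origin.git"
git remote add origin "$demo/origin.git"

# The "live server": a clone of the central repo; normally you'd SSH in for this
git clone "$demo/origin.git" "$demo/live"

# Later, after more local work, push as usual...
echo "<p>New content</p>" >> index.html
git commit -am "Add content"
git push origin HEAD

# ...and deployment is just a pull on the live server
cd "$demo/live"
git pull
```

After the pull, the live clone has both commits, with a full record of who changed what and when.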
Jeremy Harris has an article that expands on this a bit, showing you how to push from your local machine and keep the .git directory out of the web root. Joe Maller has an article on using a “Hub” area on the server.
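A popular variation of that idea is a bare repository on the server with a post-receive hook that checks pushed code out into the web root, so deployment is just a push. A runnable sketch, with a temp directory standing in for the server and all paths made up:

```shell
set -e
demo=$(mktemp -d)

# "The server": a bare repo kept out of the web root, plus the web root itself
git init --bare "$demo/site.git"
mkdir -p "$demo/public_html"

# A post-receive hook that checks pushed code out into the web root
cat > "$demo/site.git/hooks/post-receive" <<HOOK
#!/bin/sh
GIT_WORK_TREE="$demo/public_html" git checkout -f master
HOOK
chmod +x "$demo/site.git/hooks/post-receive"

# "Your local machine": commit, point a remote at the server repo, push to deploy
mkdir "$demo/local" && cd "$demo/local"
git init
git config user.email "you@example.com" && git config user.name "You"
echo "<h1>Deployed</h1>" > index.html
git add index.html && git commit -m "Deploy"
git remote add live "$demo/site.git"
git push live HEAD:master
```

The push triggers the hook, which copies the committed files into public_html while the .git data stays safely outside the web root.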
A Cron Job is a timed, automated task on a server: run X every 2 minutes. I’ve seen folks run a Cron Job that does a git pull every few minutes, and that is their deployment strategy.
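Hypothetically, the crontab entry for that strategy could be as short as this (the path and schedule here are made up):

```shell
# Every five minutes, pull the latest code into the web root
*/5 * * * * cd /var/www/mysite && git pull origin master
```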
GitHub has a feature they call Post-Receive Hooks in which they will POST to a URL of your choice when a repo has been pushed to. You could use that POST request to run a script that runs the commands you need for deployment, for instance a git pull on the server.
NetTuts has an article, “The Perfect Workflow”, that explains how this could be set up. They use a PHP file that runs shell commands, though, which in my experience many hosts lock down without even an option to allow.
Bitbucket, another version control host, also has POST Service Management.
There are some web apps out there that have made their business helping with deployment.
A fancy way of saying: tools that work by typing commands into the terminal. They typically take configuration that you either pass in the commands themselves, keep in a config file, or both.
Capistrano isn’t strictly for deployment. It is for executing commands on servers. Deployment is a super common use for it though, and it was born from that origin:
Capistrano was originally designed to simplify and automate deployment of web applications to distributed environments, and originally came bundled with a set of tasks designed for deploying Rails applications.
It’s in Ruby, but it can be used for anything.
As a website gets more complex, simply pulling code from a central repo probably isn’t going to cut it. For instance, there might be multiple servers involved. Or pulling those files might take a little time, and that in-between time could break your app. With Capistrano, you could configure it to prepare the server, pull the new files to a new location, update a symlink to point to the latest release, clean up permissions, restart services, and do it all on multiple servers.
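The releases-plus-symlink pattern at the heart of that is easy to sketch in plain shell. Everything below is illustrative, with a temp directory standing in for the server:

```shell
set -e
site=$(mktemp -d)            # stands in for something like /var/www/mysite
mkdir -p "$site/releases"

# Each deploy lands in its own timestamped directory
release="$site/releases/$(date +%Y%m%d%H%M%S)"
mkdir -p "$release"
echo "<h1>v2</h1>" > "$release/index.html"   # a real tool would pull or copy files here

# The web server's document root points at $site/current; flipping the
# symlink swaps in the whole new release in one quick step
ln -sfn "$release" "$site/current"

# Rolling back is just pointing "current" at an earlier release directory
```

Because the new release is fully prepared before the symlink moves, visitors never see a half-transferred site.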
There is a Web UI for it, but it seems a bit out of date.
RailsCasts has a bunch of screencasts to learn about Capistrano.
rsync is exclusively about file transfer.
rsync is a file transfer program for Unix systems. rsync uses the “rsync algorithm” which provides a very fast method for bringing remote files into sync. It does this by sending just the differences in the files across the link, without requiring that both sets of files are present at one of the ends of the link beforehand.
Perhaps you’re sensing a running theme:
Instead of transferring the whole project, I thought, why not only transfer the files that changed since the last time; git can tell me those files.
git-ftp pushes files up to your server just like any FTP client would, but it knows exactly what files to send up because it uses Git, which knows.
git ftp push -u <user> -p - ftp://host.example.com/public_html
Dandelion is similar to git-ftp, but it works from config files, so you can be a bit more specific about what you want to happen and simplify the command (dandelion deploy). It can also push to AWS.
Ansible configures operating systems, deploys applications, runs parallel commands, and orchestrates IT processes like zero-downtime rolling updates. It uses SSH by default, so no special software has to be installed to start managing remote machines. Modules can be written in any language.
One such plugin is grunt-ftp-deploy, which moves files from your local machine to a server over FTP. You configure the task, then run it however you prefer to run Grunt tasks. It’s a rather “dumb” task in that it doesn’t even try to move only what has changed or reference version control. It just moves everything.
A “static” site is a site that requires no services or database. Just a web server and a bunch of resource files (e.g. .html, .css, .js, images).
GitHub offers a service called GitHub Pages which will happily serve up static sites for you. You can even use your own domain name with them. You simply make a branch of a repo named gh-pages and it just kinda works. That way deployment is simply pushing to that branch of that repo.
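The gist of setting that branch up, minus the actual push to GitHub, runs like this (all names here are made up):

```shell
set -e
demo=$(mktemp -d) && cd "$demo"
git init site && cd site
git config user.email "you@example.com" && git config user.name "You"
echo "source files live here" > README.md
git add . && git commit -m "Initial commit"

# An orphan branch shares no history with your main branch; GitHub Pages
# serves whatever is committed on a branch named gh-pages
git checkout --orphan gh-pages
git rm -rf .                         # start the branch with a clean slate
echo "<h1>My Site</h1>" > index.html
git add index.html
git commit -m "First Pages deploy"

# Deployment from here would be: git push origin gh-pages
```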
There is a grunt task just for this.
Just because a site is static doesn’t mean it’s lame/simple/has poor architecture. There are build tools for static sites that allow you to smoosh together templates and content and spit out a website. Jekyll is one of those that is specifically built to work with GitHub Pages.
Octopress sits on top of Jekyll, providing configuration, templates, and such so you can get started faster.
Neither Jekyll nor Octopress necessarily helps with deployment; they are just very related to GitHub Pages, which is a form of deployment.
There is even a set of command line tools called s3cmd that you can configure and run to deploy static sites to Amazon S3.
Hopefully it’s clear now that deployment alone can be complicated. The entire web platform stack can be super complicated. It’s no wonder that companies have stepped up and now offer services to simplify it. These companies offer hosting and server management and databases and all that stuff. Easy deployment is typically part of the package.
In another vein, Mixture.io does hosting/deployment right from their desktop development tool.
Continuous integration (CI) is the practice, in software engineering, of merging all developer working copies with a shared mainline several times a day.
The idea is that individual developers don’t have a branch checked out so long that it becomes very different from the main repo and merging the two becomes difficult.
Extending this concept to the server means running that code on actual servers to make sure everything is OK. Run the build – does it pass? Run your tests – do they pass? Doing this often means catching problems early, before they grow into huge, deep ones.
This is related to deployment because people use them to automatically deploy code when all steps are passing. For instance:
- Commit/push new code to repo
- CI tool runs all builds/tests that you configure
- If stuff passes, it gets deployed
- If stuff doesn’t pass, you are notified and no deployment happens
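Stripped of all the tooling, that gate boils down to a tiny script. Everything below is a made-up stand-in for the real build, test, and deploy commands a CI tool would run:

```shell
set -e
demo=$(mktemp -d)
echo "<h1>Ready</h1>" > "$demo/build.html"

# Stand-ins for the real steps
run_tests() { grep -q "Ready" "$demo/build.html"; }        # your actual test suite
deploy()    { cp "$demo/build.html" "$demo/live.html"; }   # e.g. a git pull or cap deploy

if run_tests; then
  deploy
  echo "Tests passed, deployed."
else
  echo "Tests failed, no deployment." >&2
fi
```

The point is simply that deployment only ever runs on the success branch, so broken code never reaches the live server.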
I actually barely understand all this, so please correct me in the comments if I have it wrong. There are a number of different CI tools, so rather than try and explain something I can’t do very well, I’ll just list them:
- Travis CI (naturally works with GitHub)
- Jenkins CI (can be made to work with GitHub)
CMSs have such huge communities these days, it’s no surprise that tools pop up that are specific to them and not just their parent language.
- WP-Stack is a boilerplate for WordPress sites, assuming Git and Capistrano.
- WordPress-Starter is similar, but includes S3 backups.
- We didn’t talk much about database deployment. I’m not sure there is much to say about it. I’ve always been surprised at how awkward it is moving databases around. WP Migrate DB Pro is a good tool specific to WordPress to keep them in sync.
Feel free to chime in in the comments with any additional information, things I got wrong, things I missed, or how you go about deployment.