Deployment is moving a website from a local environment to live servers. What seems like a simple thing can actually be quite complex. There are absolutely loads of ways to go about it. They range from user friendly software and services, to more complex command line tools, to full blown systems with lots of moving parts.
I’m not an expert in all this, but I’ve set up a few different deployment systems in my day as well as worked with a number of different ones. I thought we could try and round up and break down as much as we can here.
If you edit files right on your server directly, you essentially have no deployment step at all. You might accomplish this by using an FTP enabled editor like Coda or Adobe Dreamweaver. Enough people do this that popular hosts like Media Temple even have documentation on setting it up.
You might even use something like WebDAV to mount your remote server like a hard drive on your local computer, and use whatever local editor you like. Transmit has a “Disks” feature that makes this very easy.
The appeal here is strong because you see results live so quickly. However, it is generally frowned upon, because:
- It’s dangerous: visitors see your mistakes immediately
- There is no testing
- There is no record of changes – what/who/when
- You can’t revert changes
- You can overwrite other people’s work in teams
- You can’t work on large, long term projects without additional setup
Using FTP Manually
One very home-grown deployment technique is to simply work locally and when you are ready, move the files to the live server with FTP software (like Transmit).
While this works, it can be tedious. Which files did I change? Oh, forget it, I’ll just move everything. Man this is slow.
This still has many of the same shortfalls as above. You still risk overwriting someone else’s work, you still can’t revert easily, etc.
Using FTP this way doesn’t automatically mean you aren’t using version control, but it’s likely. If you were, you’d be managing that totally independently and it would feel extra tedious. And still, nothing to prevent you from making a mistake (e.g. sending files up to FTP before you pulled from the repo). If you are using version control, you’re likely already seeking a way to tie that into deployment.
The Version Control Piece
Version control is important, and will play a role in every other deployment approach we talk about. But by itself, version control doesn’t automatically handle deployment for you. Without going into too much detail: projects are kept in repositories (or “repos”). They can have multiple contributors. Files are kept in sync. It is the contributors’ job to make sure they have the latest code and that their new code fits. There is a record of all changes.
Git is probably the most common version control tool, but be aware that Subversion (“SVN”) and Mercurial are similar and fill the same role.
Install Version Control on the Server, Pull From There
Version control software is just that, software. It likely doesn’t come standard on servers that you buy (even the “fully loaded” servers that often come with stuff like PHP automatically). But it is also likely that you can install it on those servers.
Then, just like you can git push and git pull from your local machine, you’ll be able to SSH into the server and git pull from there, which will bring down the latest files from that repo onto the live server.
That’s simple deployment right there.
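To make that flow concrete, here is a sketch you can run entirely locally: a bare hub.git stands in for the central repo (e.g. GitHub), "local" for your machine, and "live" for the server’s web root, where in real life you would SSH in and pull. All names and paths are made up for illustration.

```shell
# Everything here runs locally: hub.git stands in for the central repo,
# "local" for your machine, "live" for the server's web root.
set -e
rm -rf deploy-demo && mkdir deploy-demo && cd deploy-demo
git init -q --bare hub.git
git clone -q hub.git local 2>/dev/null
cd local
echo "hello" > index.html
git add index.html
git -c user.name=dev -c user.email=dev@example.com commit -qm "first page"
git push -q origin HEAD
cd ..
git clone -q hub.git live 2>/dev/null   # on a real server: ssh in, cd to the web root, git pull
cat live/index.html                     # prints: hello
```

On a real server the "live" steps become `ssh user@host 'cd /var/www/site && git pull'`; the rest of the flow is identical.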
Jeremy Harris has an article that expands on this a bit, showing you how to push from your local machine and keep the .git directory out of the web root. Joe Maller has an article on using a “Hub” area on the server.
A Cron Job is a timed/automated task on a server. Run X every 2 minutes. I’ve seen folks run a Cron Job that git pulls every few minutes and that is their deployment strategy.
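As a sketch, such a crontab entry might look like this (the path, branch, and log location are hypothetical):

```
# m   h  dom mon dow   command
*/5   *   *   *   *    cd /var/www/mysite && git pull origin master >> /var/log/deploy.log 2>&1
```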
Post-Receive Hooks / Webhooks
GitHub has a feature they call Post-Receive Hooks in which they will POST to a URL of your choice when a repo has been pushed to. You could use that POST request to run a script that runs the commands you need for deployment (a git pull, for instance).
NetTuts has an article, “The Perfect Workflow,” that explains how this could be set up. They use a PHP file that runs the SSH command, though, which in my experience many hosts lock down without even an option to allow.
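Stripped to its essence, the script the webhook triggers can be this small (paths are hypothetical; in practice you would also verify the request really came from your repo host):

```
#!/bin/sh
# deploy.sh: run by the webhook endpoint after each push
cd /var/www/mysite || exit 1
git pull origin master
```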
Bitbucket, another version control host, also has POST Service Management.
Third Party Deployment Web Services
There are some web apps out there that have made their business helping with deployment.
Command Line Interface (CLI) Tools
A fancy way of saying: tools that work by typing commands into the terminal. They typically take configuration either passed in the commands themselves, read from a config file, or both.
Capistrano isn’t strictly for deployment. It is for executing commands on servers. Deployment is a super common use for it though, and it was born from that origin:
Capistrano was originally designed to simplify and automate deployment of web applications to distributed environments, and originally came bundled with a set of tasks designed for deploying Rails applications.
It’s in Ruby, but it can be used for anything.
As a website gets more complex, simply pulling code from a central repo probably isn’t going to cut it. For instance, there might be multiple servers involved. Or pulling those files might take a little time and that in-between time could break your app. With Capistrano, you could configure it to prepare the server, pull the new files to new location, update the symlink to point to the latest, clean up permissions, restart services, and do it all on multiple servers. This image demonstrates that.
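The symlink dance at the heart of that flow can be sketched in a few lines of shell. The directory layout below is a common convention, not Capistrano’s literal output:

```shell
set -e
rm -rf app && mkdir -p app/releases
release=$(date +%Y%m%d%H%M%S)            # each deploy gets its own timestamped directory
mkdir -p "app/releases/$release"
echo "v2" > "app/releases/$release/index.html"
# The web server points at app/current. Flipping the symlink switches
# versions near-atomically, and rolling back is just re-pointing it.
ln -sfn "releases/$release" app/current
cat app/current/index.html               # prints: v2
```

Capistrano automates exactly this kind of thing across many servers, plus the permissions cleanup and service restarts around it.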
There was a Web UI for it, but it seems a bit out of date.
RailsCasts has a bunch of screencasts to learn about Capistrano.
rsync is exclusively about file transfer.
rsync is a file transfer program for Unix systems. rsync uses the “rsync algorithm” which provides a very fast method for bringing remote files into sync. It does this by sending just the differences in the files across the link, without requiring that both sets of files are present at one of the ends of the link beforehand.
rsync is a command, so it is often run by a task runner like Make, or more commonly Rake since it’s also Ruby.
Perhaps you’re sensing a running theme:
Instead of transferring the whole project, I thought, why not only transfer the files that changed since the last time? Git can tell me those files.
git-ftp pushes files up to your server just like any FTP client would, but it knows exactly what files to send up because it uses Git, which knows.
git ftp push -u <user> -p - ftp://host.example.com/public_html
Dandelion is similar to git-ftp, but it works from config files so you can be a bit more specific about what you want to happen and simplify the command (dandelion deploy). It can also push to AWS.
Ansible configures operating systems, deploys applications, runs parallel commands, and orchestrates IT processes like zero-downtime rolling updates. It uses SSH by default, so no special software has to be installed to start managing remote machines. Modules can be written in any language.
As far as build tools / task runners go, Grunt is shaping up to be king. It is in Node.js, which has a package manager (NPM) of its own, similar to Ruby Gems.
One such package is grunt-ftp-deploy, which moves files from your local machine to a server over FTP. You configure the task, then simply run it however you prefer to run Grunt tasks. It’s a rather “dumb” task in that it doesn’t even try to only move what has changed or reference version control. It just moves everything.
A “static” site is a site that requires no services or database or anything. Just a web server and a bunch of resource files (e.g. .html, .css, .js, images).
GitHub offers a service called GitHub Pages which will happily serve up static sites for you. You can even use your own domain name with them. You simply make a branch of a repo named gh-pages and it just kinda works. That way deployment is simply pushing to that branch of that repo.
There is a grunt task just for this.
Just because a site is static doesn’t mean it’s lame/simple/has poor architecture. There are build tools for static sites that allow you to smoosh together templates and content and spit out a website. Jekyll is one of those that is specifically built to work with GitHub Pages.
Octopress sits on top of Jekyll, providing configuration, templates, and such so you can get started faster.
Neither Jekyll nor Octopress helps with deployment necessarily; they are just very related to GitHub Pages, which is a form of deployment.
Amazon has web services where you can actually run servers, but it also has S3 which is simply file storage. You can actually host a static website on S3 and even use your own domain.
There is even a set of command line tools called s3cmd which you can configure and run to deploy static sites.
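For instance, a one-line sync of a built site to a bucket might look like this (the bucket name and local directory are made up):

```
s3cmd sync --acl-public --delete-removed _site/ s3://my-site-bucket/
```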
Platform as a Service (PaaS)
Hopefully it’s clear by now that deployment alone can be complicated. The entire web platform stack can be super complicated. It’s no wonder that companies have stepped up and now offer services to simplify it. These companies offer hosting and server management and databases and all that stuff. Easy deployment is typically part of the package.
There is tons of PaaS at the enterprise level. See SalesForce, RedHat, and IBM for the tip of the iceberg.
In another vein, Mixture.io does hosting/deployment right from their desktop development tool.
Continuous Integration Servers
Continuous integration (CI) is the practice, in software engineering, of merging all developer working copies with a shared mainline several times a day.
The idea is that individual developers don’t have a branch checked out so long that it becomes very different from the main repo and merging the two becomes difficult.
Extending this concept to the server means running that code on actual servers to make sure everything is OK. Run the build – does it pass? Run your tests – do they pass? Doing this often means catching problems early and huge deep problems never develop.
This is related to deployment because people use them to automatically deploy code when all steps are passing. For instance:
- Commit/push new code to repo
- CI tool runs all builds/tests that you configure
- If stuff passes, it gets deployed
- If stuff doesn’t pass, you are notified and no deployment happens
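That gate is easy to picture as a tiny shell sketch, with stand-in functions where a real CI server would run your actual test suite and deploy script:

```shell
run_tests() { true; }                        # stand-in: your real build/test suite
deploy()    { echo "deployed"; }             # stand-in: rsync / Capistrano / etc.
notify()    { echo "build failed, not deploying"; }

if run_tests; then deploy; else notify; fi   # prints: deployed
```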
I actually barely understand all this, so please correct me in the comments if I have it all wrong. There are a number of different ones, so rather than try and explain something I can’t do very well, I’ll just list them:
- Travis CI (naturally works with GitHub)
- Jenkins CI (can be made to work with GitHub)
CMSes have such huge communities these days, it’s no surprise tools pop up that are specific to them and not just their parent language.
- WP-Stack is a boilerplate for WordPress sites, assuming Git and Capistrano.
- WordPress-Starter is similar but includes S3 backups
- We didn’t talk much about database deployment. I’m not sure there is much to say about it. I’ve always been surprised at how awkward it is moving databases around. WP Migrate DB Pro is a good tool specific to WordPress to keep them in sync.
Drupal has a command line interface called Drush which has a deploy script called Drush Deploy.
Is That All?
Nope, probably not. I didn’t even mention stuff like Puppet and Salt Stack, which I don’t even really understand.
Feel free to chime in in the comments with any additional information, things I got wrong, things I missed, or how you go about deployment.
We are using Continuous Integration in my current office. TeamCity to be more precise.
I also don’t know much about the whole CI thing. The way my manager likes to keep it is that no one is allowed to make any kind of branches, except for very particular and uncommon cases, so I guess this fits what you said about no one having the repository checked out for too long, but I don’t like this because it removes a lot of the advantages of using a VCS.
I wonder if other people working with CI are having this issue, or is just my case that the restrictions make it hard to work with.
I went on this adventure myself a bit back trying to look at it from a CLI perspective.
These 3 commands are the base of any file transfer from a terminal shell. It should be known that rsync is not the best at tracking files the way Git can, detecting when files are no longer present or changed from the original source.
wget works great, but for single retrieval scenarios and one-way communication, not to mention you need to know the URI any time you need to execute it.
scp allows files to be copied to, from, or between different hosts. That means it works great at two-way communication. It uses ssh for data transfer and provides the same authentication and same level of security as ssh. I don’t know about anyone else, but scp and the rest of it sounds like a ton of typing compared to using a GUI, and we still can’t sync an entire project or an array of subdirectories properly. This is where SSH and post-receive hooks work great when using Git for deployment.
Dennis, I think you should take a closer look at rsync. Not only can it detect deleted files, but it can detect changed files of identical size using an MD5 hash. It’s very fast and is really the only thing needed (SSH needs to be installed, of course). There is no need to do anything special. You can rsync down (server to local) to catch any “hot fixes” (oftentimes our team fixes typos or other misc stuff directly via FTP), merge local, test local, then rsync up (deploy the new version of the site). It’s very fast and there is no need to maintain a repo on your production server (not a big deal, but I’d rather keep that thing as lean as possible). Your rsync commands are portable and you own them, easy to fine-tune or update. They work with OS X, Debian, etc.
Please cut out any deployment strategies that require FTP. I shouldn’t need to explain why….
Please do explain why. Is there something bad about using FTP to auto-push version-controlled code?
As a side note, it’s kind of funny that FTP is glossed over in the article as if the vast majority of websites don’t use it (for good or bad).
I’m sorry, but that is such a weak argument: “don’t do this because I said so.”
YES, yes you do need to explain why, otherwise you come off looking like an arrogant so-and-so.
If you read carefully what Chris said, he’s not recommending FTP—just mentioning it as an option.
Awesome post. Love how you never skimp on the details.
Using Git, you can also push files directly to your server like you would to GitHub. A good tutorial on how to setup this via SSH is this one by Arlo Carreon.
That said, Capistrano is more suitable as your site scales up (think multiple servers), but pushing via git is great if you have just the one server to which you are deploying.
Git is very good at pushing to multiple sites at once. One remote can have several upstream URLs. All servers need a local repo with post-receive/checkout hooks though. I’ve one repo that pushes to 4 URLs (1 is GitHub, 3 are servers).
Thanks for the post Chris. This topic is definitely interesting to me because I love using GIT, but unfortunately a lot of places still use FTP for deployment. I know one way people tend to play it “safe” with going commando is to have separate dev and live servers. That still doesn’t solve the problem of tracking changes and reverting mistakes though.
I actually ended up going the GitHub Pages route with my site. It works well and it is far easier for me than trying to keep my local copy and my server copy in sync.
I’d be interested to hear what you use for deployment on css-tricks.com. Dreamweaver used to have a rudimentary “check in/out” system and would keep track of which files you changed locally. Not as good as svn/git, but not bad for a small web design team.
CSS-Tricks.com is on Beanstalk. Someday I’d consider getting it over on GitHub since there is really no reason it needs to be private and then I get the issue tracker and such.
Cracking round-up Chris!
It also might be worth noting that GitHub Pages is a free service, which is a big win in my books. I did a small post on getting set up which might help someone: Deploying via GitHub for Free.
I never understand why people are so hard on an old but reliable protocol like FTP… sure it is old… but for people working on small business web sites with clients who do not have the money for hundreds of dollars a month for hosting on a dedicated server, FTP may be the only option to get files from a local machine to a shared web server… laugh as you will but that is the reality for smaller web sites… Not everyone is working on a large web app…
Anyone editing directly on a live server is living dangerously for sure…that I agree with…
git clone / git pull – 5min in the worst possible case
ftp – 1-3 hours at least
For a protocol specialized in file transfer FTP is a waste of time whenever I want to send files.
Please check out https://deploybutton.com. I found it very easy to set up. It supports multiple configurations.
An even better solution is http://dploy.io/, as it seems like deploybutton has been abandoned.
Nice summary. But you are missing tools or environments that are specifically meant for deployment automation. A la LiveRebel, uDeploy, DeployIT, InRelease etc.
Pelican is another great static site (and/or blog) generator. I’ve recently made a heroku buildpack so it works great with Heroku.
Wow… this post couldn’t have come at a better time! I’ve been looking into WordPress deploys with Capistrano, but it doesn’t suit the servers we use as we’re bound by their structure, and it makes the symlinking a messy solution.
Rsync, however, will work quickly and easily, and is not opinionated!
This article outlines a simple method for running a rake task for WP deploys:
Minus the Sass and Compass stuff, which I would handle with Grunt, this is far simpler than setting up Capistrano!
This article is awesome for quick reference of major concerns in Rsync deployment:
We use JIRA,Jenkins, GIT and GITHUB.
we create a feature ticket or bug ticket in JIRA, then create a branch from our main line with that JIRA ticket number. We fix the bug or when the branch is feature complete we send a pull request.
That branch is then merged into the main line and the commit ref the JIRA number and the JIRA ticket is updated with a link to the commit relating to it.
Once the ticket is closed we move onto the next.
We have weekly releases, so then at the end of the week, we use Jenkins to push to a staging area.. we test, then we push to live. Then it all starts again.
Hotfixes are done on branches from staging and backported.
It’s a nice, easy-to-follow flow. Lots of checks and accountability.
I’d love to know if anyone has any good suggestions if you’re using sourcesafe with sourcegear….or just sourcegear, and don’t have the admin rights to the server so can’t be logging on to run a “get latest version” from the server
The best I ever managed was to do a “get latest version” to a new directory then FTP the whole thing….yes Eric, you heard I said FTP :)
I don’t see how anyone could live without version control these days. I can’t even remember all the times I’ve done changes that I thought worked out just fine, all the while it broke something that I did earlier. Not realizing an issue immediately is a big problem, but it’s fixable with version control.
Although I have to say that I am yet to find a solution for keeping the 3 machines I work on in sync. I have A) the office, B) home office, C) laptop. Keeping Photoshop & Illustrator settings, Coda plugins, CodeKit sites/settings, testing servers in MAMP and all the databases for various WP sites in sync is HARD work… Any tips for how to do all of that? :-)
I use Dropbox for syncing between computers. It’s actually very easy if you set it up correctly. I wrote an article about my setup a while back. Check it out if you would like… Dropbox Sync for Development
Leave a comment if you have any questions
Patrik, I think the solution for you is called “Dropbox + hardlinks (aliases)” or any other similar service, e.g. SugarSync ;)
@Chris Thanks, that’s a nice solution for the MySql part, but I also need to figure out how to sync Coda site settings, Coda plugins etc. I can probably symlink my way out of that issue too I guess – but it seems like such a time-consuming thing to get up and running
I’m seriously considering getting rid of the two iMacs and get 27″ cinema displays instead, and use nothing but the laptop. Though I’d have to fix the problem of always forgetting the laptop at home… :)
No one should be without some sort of version control.
I second Chris’ response: Dropbox is just about perfect for all the other stuff (art, fonts, settings) that we need to keep in sync between boxes. Technically, Git can work as well, but then repos get pretty big. Adobe prefs… not sure about that one. I am curious about Bittorrent Sync as a way to keep various directories synced.
Syncing databases, now, that’s another problem. Which is why I started building with Statamic: flat-file, no database. Soooo much easier to keep everything sync’d and version-controlled.
You didn’t mention the 2 that I use:
RPM/Yum (for our web applications) and Erica (for CouchDB CouchApps)
This is a variation on the “Going Commando” technique. When I am working on a second version of a site, sometimes I just setup a new virtual server record on the production server and give it a subdomain (for testing). Then, when I am ready to go live, I just edit the virtual server records on the server and reload apache. With this technique, I am not moving any files or creating new databases to go-live. I am just rearranging which virtual server is dev and which is prod. One caveat with wordpress sites is that you would need to configure the dev system to have a different site url. This is pretty easy to do by setting the WP_SITEURL constant in wp-config.
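As a sketch, the go-live swap amounts to editing which virtual server record owns which name and reloading Apache (names and paths below are hypothetical):

```
<VirtualHost *:80>
    ServerName example.com          # was dev.example.com before go-live
    DocumentRoot /var/www/site-v2
</VirtualHost>
# then: sudo apachectl graceful
```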
For WordPress sites, I’ve just started setting up local dev and deployment with DesktopServer from http://serverpress.com/products/desktopserver/
It’s built on XAMPP, and direct deploys without FTP. So it looks to be an easy process for folks who work solo. Any thoughts?
I’ve actually been using Dropbox to do my deploys. It’s super convenient and I can still use version control on top of it….
Basically, I have two folders setup in Dropbox, a dev folder and a production folder. The live site is served out of the production folder, and my dev machine serves out of the dev folder. Whenever I’m ready to push to production I use version control to sync the two folders and I’m done. If I ever need to make any small or simple tweaks to the production site, I just edit the contents of the folder and sync them to dev later.
May not be the most secure, but it’s nice to never have to ssh or remote into another machine…
Great post! Another nice way to deploy over a ssh connection is to use Fabric.
Another great tool that cuts out a lot of the risk factor is RapidWeaver. The themes it comes with are not very impressive, but there are a bunch of 3rd party developers that make a ton. It works similar to FTP in a sense, but nothing like it either: you make and edit pages locally and can preview how they’ll appear in any web browser you have installed. Then you upload when ready. It is Mac only though.
This is one of those topics that developers can debate ad infinitum. There are so many ways to deploy and every situation is unique. I appreciate the breadth of options you’ve presented here- as well as the comments from other folks about what works for them.
It took me awhile before I found a flow that worked for me- so if anyone is reading this and feeling overwhelmed at the choices and strong opinions, relax- you need to find something that works for you and your organization/clients/team/etc. It doesn’t have to be rocket science (but it could be if you want it to!)
For what it’s worth, my preference is pushing to Github/Bitbucket, then pulling from a staging server, then rsyncing to production. For media and whatnot that is outside of the repo, I rely on ssh and sftp on custom ports.
Check this, laravel deploy.
I’m sure there will be packages developed for other PHP frameworks.
For really basic prototypes our designers deploy folders through dropbox with site44.com
I’ve only used 4 of these.
The traditional FTP transfer. It’s a hard way to deploy and update whenever you have a ton of updates done in various folders. I end up just re-uploading everything and overwriting everything.
And of course, as a Rails developer, I have been git pull-ing on the server or just using Capistrano. Used Heroku several times too.
For me, the best deployment was with git and Capistrano. I’ve never used any of those deployment services yet.
Heroku does a lot more than just Ruby on Rails; I wouldn’t even say they specialize as a RoR host, as evidenced here. I’ve been hosting Node apps with them and they’re fantastic. They officially support Java, Node.js, Python, Ruby, Scala, and Clojure and they have undocumented support for PHP.
Haven’t had a chance to play with it yet but http://www.phptesting.org/ look to have a pretty interesting PHP CI solution (in Beta at the moment).
Currently I use git flow (http://nvie.com/posts/a-successful-git-branching-model/) combined with some post commit / merge hooks for archiving and deployment but I really want to get Jira up and running along with a CI solution.
I need to rewrite the hooks but if anyone is interested you can find them on BitBucket
Do you know the PaaS OpenShift?
Thanks for your article, which gave me a good many options to consider! But I’m not really sure about their services and performance…
I have a Linode account to host my own blogs and just use FTP for file synchronization, because the blogs are small and there’s only me working on them. However, for my company, we have our own data center, so we built ourselves an SVN server to serve many websites.
I use and love http://modxcloud.com for my MODX CMS websites. Could be added to the list!
One step I believe is often overlooked when deploying a website that relies on some sort of CMS (wordpress, b2evolution, drupal, or your own…) is that deploying updated PHP and static files is not enough. You will frequently need to run a script to create/update the database or install a plugin, etc.
During all that time your site is just broken.
In my opinion: leaving a site broken during the time of the upgrade is just as bad as live editing your site with FTP.
So I would say one should focus on first putting their site into “maintenance mode” by displaying a “maintenance” page on every request and then put it back to normal operations once the whole setup is complete.
This can be done with a mod_rewrite in the .htaccess file for example. Each CMS may also have a specific way of handling that for you.
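A hedged .htaccess sketch of that maintenance-mode redirect (the page name is an example):

```
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ /maintenance.html [R=302,L]
```

In practice you would also exclude any CSS or images the maintenance page itself needs, or inline them into the page.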
No Python love at all?? :)
Another commenter mentioned Fabric already (This is the Fabric boilerplate I use) , but additional Python packages make deployment a breeze:
Buildbot for continuous integration, automated builds/tests/etc
Salt for config management, release orchestration, pretty much anything you’d want to do. (Note: this is a much heavier package, ideal for large scale deployments)
for those who like a UI for git and svn
Some awesome services listed here, Chris. Recently, starting our own agency, we have been looking at various solutions for deployment, so we’re taking a deeper look at what’s on offer.
This is a kick ass roundup. Props for posting.
Ulrich posted above about Deploy Button. I wrote a blog post & did a screencast about how designers (me included) can use Git and Deploy Button (or similar service) for easy deployments:
I ended up doing this directly with my Bluehost shared account. I push changes live to the main website by having a git repository in a folder on my hosting.
I’ll guess https://www.cloudcontrol.com/ is something similar that might be worth a mention here.
One of my favorite ways to deploy themes to WordPress was this plugin that deployed your github-hosted themes through WordPress own update feature. Too bad the project seems kind of abandoned.
Thank you!.. This is the best high-level explanation of “deployment” I’ve seen.
It’s easy for newbies to get intimidated by lingo like “deploying your app”, but your article nicely shows it’s really just a matter of HOW you copy your files from computer A to computer B.
Very nice post! I tried FTPloy but it was buggy. So I came up with my own solution:
I like to manage everything from a BitBucket repo (git, issues, users …).
FTP is a great way to copy files and is present on most servers, as long as you do your own source code management.
I’ve written a little PHP script that helps with FTP deployments. It is called PHPloy and is tightly coupled with Git to determine which files were edited/added/deleted and upload them accordingly. You put your FTP details in a deploy.ini file and just run a command to deploy:
You can also deploy to multiple servers at once. And if you have multiple servers configured, you can select to deploy to one of them like this:
There is more that can be done – check it out on Github: https://github.com/banago/PHPloy
dploy.io is now free for 1 private repository, plans start at $15. Give it a try!
I work in a small company. I am the only developer. All sites are in WordPress. I am using MAMP and localhost. Uploading files to a subdomain through FileZilla FTP to show the client, then downloading back to my localhost and uploading to the live server, is annoying! I don’t know what to use. Please help, with steps on how I can set it up.
I don’t understand how git and Bitbucket work. How can I deploy from localhost to a subdomain or live server?
I know that for database i have to use DBMigrate. And just import/export
OK, so I am using git and Bitbucket. How do I download my repository from Bitbucket and install it on my FTP server? Something free?
Give http://dploy.io a try, it’s free and does just that and more.
Dima, I’ve seen this, but there you have just one free repository. When I have more websites in progress that I am working on, I need more repositories to deploy through FTP. Any other option?
Like now, I am working on 3 websites locally and need to deploy them to 3 different subdomains.
thanks for help
Everybody forgets Joomla :(
Thanks for the article. For many users without PHP or Linux knowledge, using third party cron job service like easycron.com is a good choice.
Thanks so much for this!