Git and WordPress Workflows

Using Git with WordPress has always been a little bit of black magic. You can’t keep the database under Git, and there is no built-in way to run migrations against it. We are left putting our code and files under Git management and leaving database syncing to something like WP Migrate DB.
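In practice the split ends up looking something like the sketch below, with the code in Git and the database exported on its own. The paths and the WP-CLI export are just an illustration of the idea, not a prescribed workflow:

```bash
# Code and files: tracked in Git as usual
git add wp-content/themes/my-theme wp-content/plugins/my-plugin
git commit -m "Update theme and plugin"
git push origin master

# Database: handled outside Git, for example a manual export with WP-CLI
# (WP Migrate DB does roughly this for you, plus URL and path rewriting)
wp db export backup-$(date +%Y%m%d).sql
```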

Things get even trickier when you are working with a team. Let’s imagine we have a couple of developers working on your WordPress theme, or on some plugin that runs on your WordPress install. Any changes a developer makes to the database need to be synced with the live version when the plugin or theme changes go live. This is where the problems really start to mount.

If a user has created a new blog post or page, then anything you push up will overwrite their new entries. With WP Migrate DB you can choose to sync only certain tables, but what if you are both working on the same database table?
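If you ever need to move just part of the database by hand, the rough equivalent is a table-limited dump. The table names below are only the WordPress defaults (your prefix may differ), and this still doesn’t help when two people are editing the same table:

```bash
# Dump only the tables you are working on (here: options and postmeta),
# leaving the posts and pages created by editors untouched
mysqldump -u dbuser -p wordpress wp_options wp_postmeta > partial-sync.sql

# Import the partial dump on the other environment
mysql -u dbuser -p wordpress < partial-sync.sql
```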

Decide Where Your Real Data Lives

You need to decide if you want a local-to-live setup or a live-pull setup. The difference comes down to how many people are working on your content. If you have lots of content editors, then it is best to treat the live database as the source of truth and pull changes down with WP Migrate DB. This way you can add content locally and try out how different pages or plugins behave, knowing it will all be overwritten the next time you pull in the live database.
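A live pull can also be done by hand with WP-CLI if you’d rather not use the plugin; the host names and paths here are placeholders:

```bash
# On the live server: export the production database
ssh user@live.example.com "cd /var/www/site && wp db export /tmp/live.sql"
scp user@live.example.com:/tmp/live.sql .

# Locally: import it and rewrite URLs for the local environment
wp db import live.sql
wp search-replace 'https://live.example.com' 'http://site.local' --all-tables
```

The search-replace step matters because WordPress stores absolute URLs in the database, and WP-CLI handles serialized data correctly where a plain SQL find-and-replace would corrupt it.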

If you are the only one working on your WordPress install, as is the case for this blog, then you should work locally and push all your changes up afterwards. This is hugely beneficial, as you can make as many changes (and mistakes) as you like without any of it going live.
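Pushing in the other direction is just the reverse of the pull above, with the obvious caveat that it overwrites whatever is on the live database, so it only makes sense when you are the sole author. Again, hosts and paths are placeholders:

```bash
# Export the local database and copy it to the live server
wp db export local.sql
scp local.sql user@live.example.com:/tmp/local.sql

# On the live server: import it and rewrite URLs (this overwrites live content)
ssh user@live.example.com "cd /var/www/site && wp db import /tmp/local.sql \
  && wp search-replace 'http://site.local' 'https://live.example.com' --all-tables"
```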

Move Your Uploads to S3

Hosting all your files on Amazon S3 makes things a lot easier for you. The Amazon S3 and CloudFront plugin makes this incredibly easy to manage. Once you make this change, all the files you reference through the media manager will use S3 URLs.

That means when you pull in the remote database, you will be working with the same files, even if you didn’t download the uploads directory. You can also get a speed boost by enabling CloudFront support on your uploads, which gives you CDN delivery of all your media attachments. For the most part you also won’t have to worry about backups, although if the files are of a sensitive nature I would still back them up anyway.
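If you already have a pile of attachments sitting on the server, a one-off copy into the bucket with the AWS CLI is a reasonable starting point (the bucket name and paths below are made up, and the plugin still manages new uploads on its own):

```bash
# One-off copy of the existing uploads directory into the media bucket
aws s3 sync wp-content/uploads s3://my-wordpress-media/wp-content/uploads \
  --acl public-read
```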

Despite What You’ve Heard, Don’t Exclude the Core Files

All the WordPress and Git articles I’ve seen have their .gitignore file excluding all the WordPress core files. I think this is good practice when using a package system like npm, Composer, or Bower, but it breaks down with something like WordPress.

All those package systems come with a package file, like composer.json, that tells you the exact version of everything that needs to be installed. No such file exists for WordPress, so from the repository alone you have no way of knowing which version of WordPress the site is running. If you keep the WordPress core files in your Git repo, then you can see whether a new point release or major release has come out since your last pull.
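In practice that means an ignore file closer to this sketch, where core and your own code stay tracked and only environment-specific or user-generated files are left out (the exact entries, like a separate local config file, will vary by setup):

```
# Keep WordPress core plus our themes and plugins tracked in Git.
# Only ignore environment-specific and user-generated files.
wp-content/uploads/
wp-content/cache/
wp-content/upgrade/
wp-config-local.php
.DS_Store
```

With core in the repo, the running version is always visible in wp-includes/version.php (or via `wp core version`), so it is obvious when a release has landed since your last pull.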

Conclusion and Further Discoveries

We have this problem at Legend Boats, where we have multiple developers, content writers, editors, and managers. Everyone is trying to manage the content, and they all have different abilities and goals.

I saw a talk at jQueryTO last year and got to see how the jQuery team manage their WordPress sites. They actually use a Vagrant box that they move around, then tag when they are ready to push live. That is a little more complicated than I want to get into here, but it is worth noting that even the best programming organizations struggle with this problem. Below is a quick video demonstrating how we work with WordPress and try to keep everything managed.