MySQL Database Backups with Google Drive

Note: The following instructions were written for and tested on Ubuntu Server 16.04 but should also work on other versions/distros

When I set up the server for this website I wanted a simple but suitable method for backing up the database off-server. Since I already use Google Drive, it made sense to back up the database there. The following tutorial explains the steps I took and what I came up with.

GDrive

First we need to download a copy of gdrive, which is a command-line utility for interacting with Google Drive. Download the correct version for your system from the GitHub project’s readme. I needed gdrive-linux-x64, but yours may differ. Once it’s downloaded you’ll want to make the file executable and move it somewhere it can be called from any directory.
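
As a rough sketch, fetching the binary might look something like this (the URL below is only a placeholder; use the actual link for your platform from the readme):

wget -O gdrive-linux-x64 "<download link from the readme>"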

Since the script and a lot of the commands we’re going to run need root privileges, we’ll log in as the root user now; all other commands in this tutorial are assumed to be run as root:

sudo -i
chmod +x gdrive-linux-x64
mv gdrive-linux-x64 /usr/local/bin/gdrive

Now we can call the utility with gdrive instead of typing the full path every time. Before we can upload any files we need to link the utility with a Drive account by calling gdrive about:

gdrive about
Authentication needed
Go to the following url in your browser:
https://accounts.google.com/o/oauth2/auth?access_type=offline&client_id=...
Enter verification code:

Visit the link given in your terminal and select the relevant Google account if you’re signed in to more than one. Once you’re signed in you should be presented with something that looks like the following:

[Image: Google Drive authorisation prompt]

Click Allow and then copy/paste the given access code into your terminal:

Enter verification code: ...
User: John Smith, [email protected]
Used: 11.4 GB
Free: 4.7 GB
Total: 16.1 GB
Max upload size: 5.2 TB

To make sure gdrive is fully authorised to use your Drive account, type gdrive list at the terminal and you should be presented with a list of some of the files on your account. Once you’re sure the utility is authorised, we need to create a directory where we’ll be storing our backups:

gdrive mkdir db-dumps
Directory 0BwvPrE-VLdWoS2xva3BNYjhiXVE created

Make a note of the returned directory ID as we’ll need this when we write our backup script.

Backup Script

Now that we’re all set up to push files to our Drive account, it’s time to write the backup script itself. Create a new file called db-dump and update its permissions so that it is executable:

touch db-dump
chmod +x db-dump

Edit the newly created file using your favourite text editor so it looks like this:

#!/bin/sh

# Bail out if there are any errors.
set -e

# The name of the database we're going to backup.
dbname="mydatabase_"

# The Google Drive folder ID where database exports will be uploaded to
gdrivefolderid="0BwvPrW-CLdQoS5xvc3BPYjgiMVE"

# Number of days we want to retain local backups for
retentiondays=14

# Date format for dates appended to database export files 
dateformat="%Y-%m-%d_%H:%M:%S"

# The local directory where we'll be storing database exports
dumpdir="/var/spool/db-dump"

# Options to pass to the mysqldump command
mysqlopts=""

# Make sure the directory exists
mkdir -p "$dumpdir"

# Delete local export files older than our retentiondays value
find "$dumpdir" -type f -name "*.sql.gz" -mtime +"$retentiondays" -print -exec rm "{}" \;

# Zip up any existing export files
find "$dumpdir" -type f -name "*.sql" -print -exec gzip "{}" \;

# Perform a backup of the live database
file="$dbname-$(date +"$dateformat").sql"
path="$dumpdir/$file"

mysqldump $mysqlopts "$dbname" > "$path"

# Upload the newly created file to Google Drive
gdrive upload --parent "$gdrivefolderid" "$path"

You’ll want to update the variables above for your setup:

dbname Name of the database to back up
gdrivefolderid Google Drive remote folder ID
retentiondays Number of days to keep local backups on the system before deletion
dateformat Date format accepted by date. Appended to backup filenames
dumpdir Directory path where local backups should be stored
mysqlopts Options to pass to mysqldump
e.g. mysqlopts="--user=root"
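
If your MySQL root user needs a password, one way to keep the credentials out of the script is to let mysqldump read them from an option file instead. This is just a sketch, assuming a file at /root/.my.cnf that only root can read:

# /root/.my.cnf -- lock it down with chmod 600
# mysqldump reads the [client] and [mysqldump] groups from this file
[mysqldump]
user=root
password=your-password-here

With that in place mysqlopts can stay empty, since mysqldump picks the file up automatically when the script runs as root. Alternatively you can point at a specific file with mysqlopts="--defaults-extra-file=/root/.my.cnf"; defaults options have to come before anything else on the command line, which works out here because $mysqlopts is the first argument to mysqldump in the script.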

The script first looks for any local backups older than the retention period and deletes them from the directory. It then gzips any remaining uncompressed exports to save space and creates a new database export for today. Once that file is created it is uploaded to our Google Drive folder:

./db-dump 
Uploading /var/spool/db-dump/mydatabase_-2016-11-07_12:53:23.sql
Uploaded 0BwvPrE-VLdWoWjBmRTlJajRMVGM at 393.2 KB/s, total 683.5 KB

Automatic Backups

To run the backup script automatically, the easiest thing to do is place it within one of the /etc/cron.* directories. I placed mine inside /etc/cron.daily so that the script is executed by the root user once every 24 hours. If you want more specific scheduling you’ll need to edit the crontab; both options are sketched below. There’s a pretty good crontab tutorial over at nixCraft which I’d recommend reading if you’re not au fait with crontabs.
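
As a rough sketch (the paths below are assumptions, so adjust them for wherever your copy of the script lives):

# Option 1: let cron.daily run the script once a day as root
mv db-dump /etc/cron.daily/db-dump

# Option 2: add an explicit entry to root's crontab (crontab -e) so the
# script runs at 02:30 every morning
30 2 * * * /usr/local/bin/db-dump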

Conclusion

So there we have it, automatic database dumps uploaded to off-server storage once every 24 hours. The script itself is pretty basic and there’s a lot of room for improvement. Check out the script’s GitHub repository and see what improvements I’ve made to it already.

If there are things you think could be better, or things you think I’ve done wrong feel free to leave a comment below or fork the repository and submit a pull request.

