Fast(er) MySQL Backup

I am using S3 for my database backups because it is cheap and has worked quite well, but the database kept growing bigger and bigger. So I tried to optimize the process, with success.

The first optimization was to replace mysqldump with mydumper, which uses multiple cores and can therefore export the data much faster.

But before you can install it, you have to install a few build dependencies:

sudo apt-get install libglib2.0-dev libmysqlclient-dev zlib1g-dev libpcre3-dev libssl-dev cmake -y

Now you can download, build and install mydumper like this:

git clone https://github.com/mydumper/mydumper.git
cd mydumper
cmake .
sudo make install
cd .. && rm -rf mydumper

My script exports the database into a local directory, compresses the files, and uploads them to the cloud. The root hard drive is quite small, so I had to use a different temporary directory:

mkdir /mnt/externehdd/tmp/mysql_backup
sudo chmod a+rw /mnt/externehdd/tmp/mysql_backup

Using mydumper is easier than using mysqldump because it ships with sensible defaults. You just have to specify the output directory and enable compression (-c):

mydumper -c -o /mnt/externehdd/tmp/mysql_backup
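This invocation assumes mydumper can authenticate without extra flags. Like other MySQL tools it supports a --defaults-file option, so you can keep credentials out of the command line. A minimal sketch of such an options file (path, user, and password are placeholders; depending on your mydumper version the relevant section may be [client] or [mydumper]):

```ini
# Placeholder credentials; store the file with restrictive permissions
[client]
host     = localhost
user     = backupuser
password = secret
```

You would then call mydumper with --defaults-file pointing at that file instead of passing -u and -p directly.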

If you also intend to use Amazon S3, you have to install the AWS CLI. There are alternatives like s3cmd, but I prefer Amazon's own toolchain. You can install and configure it like this:

sudo pip install awscli
aws configure

The following script shows my setup. It creates a compressed MySQL dump (all databases except mysql), uploads it to Amazon S3, and deletes the local copy:


#!/bin/bash

# Basic variables
TIMESTR=$(date +"%y%m%d")
OUTDIR=/mnt/externehdd/tmp/mysql_backup
BUCKET=s3://my-backup-bucket   # placeholder, use your own bucket

# Dump all databases except mysql, compressed, into OUTDIR
mydumper -c --regex '^(?!(mysql))' -o "$OUTDIR"

# Upload the dump and remove the local copy
aws s3 cp "$OUTDIR" "$BUCKET/mysql_$TIMESTR" --recursive
rm -rf "$OUTDIR"/*    # the glob must stay outside the quotes to expand
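A side note on the %y%m%d format used for TIMESTR: because it runs from year to month to day, the resulting folder names sort lexicographically in chronological order, which keeps the S3 listing tidy. A quick demonstration (the sample dates are arbitrary; date -d is the GNU coreutils syntax available on Ubuntu):

```shell
# Format two arbitrary dates the same way the script does
a=$(date -d "2024-01-05" +"%y%m%d")   # 240105
b=$(date -d "2024-11-30" +"%y%m%d")   # 241130

# Lexicographic comparison agrees with chronological order
if [[ "$a" < "$b" ]]; then
    echo "sorted chronologically"
fi
```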

It is also quite easy to automate the backup process using cron. Simply add an entry to your crontab or drop the script into /etc/cron.daily or /etc/cron.weekly.
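For example, a crontab entry that runs the backup every night at 3:00 could look like this (the script path is a placeholder for wherever you saved the script above):

```
# m h dom mon dow  command
0 3 * * * /home/user/mysql_backup.sh
```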