The first optimization was to replace mysqldump with
mydumper, which uses multiple cores and thus allows an even faster export.
But before you can build it, you have to install some dependencies:
sudo apt-get install libglib2.0-dev libmysqlclient-dev zlib1g-dev libpcre3-dev libssl-dev cmake -y
Now you can download, build and install mydumper like this:
git clone https://github.com/maxbube/mydumper.git
cd mydumper
cmake .
make
sudo make install
cd .. && rm -rf mydumper
My script exports the database into a local directory, compresses the files and uploads them into the cloud. The root hard drive is quite small, so I had to use a different temporary directory:
mkdir /mnt/externehdd/tmp/mysql_backup
sudo chmod a+rw /mnt/externehdd/tmp/mysql_backup
mydumper is easier to use than
mysqldump because it ships with sensible defaults. You just have to specify the output folder and enable compression:
mydumper -c -o /mnt/externehdd/tmp/mysql_backup
If you also intend to use Amazon S3, you will have to install the
awscli tools. There are some alternatives like
s3cmd, but I prefer the Amazon toolchain. You can install and configure it like this:
sudo pip install awscli
aws configure
The following script shows my setup. It creates a compressed MySQL dump of all databases except the mysql system database, uploads it to Amazon S3 and deletes the local copy:
#!/bin/bash

# Basic variables
BUCKET="s3://bucket"
OUTDIR="/mnt/externehdd/tmp/mysql_backup"
TIMESTR=$(date +"%y%m%d")

# Dump all databases except the mysql system database
mydumper -c --regex '^(?!(mysql))' -o "$OUTDIR"

# Upload the dump and remove the local copy
# (the glob must stay outside the quotes, otherwise it is not expanded)
aws s3 cp "$OUTDIR" "$BUCKET/mysql_$TIMESTR" --recursive
rm -rf "$OUTDIR"/*
It is also quite easy to automate the backup process using cron. Simply add an entry to your crontab or drop the script into /etc/cron.daily or /etc/cron.weekly.
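For example, assuming the script above is saved as /usr/local/bin/mysql_backup.sh (the path and log file are assumptions, adjust them to your setup), a crontab entry that runs it every night could look like this:

```shell
# Run the MySQL backup every night at 03:00;
# append stdout and stderr to a log file (hypothetical path)
0 3 * * * /usr/local/bin/mysql_backup.sh >> /var/log/mysql_backup.log 2>&1
```

Edit your crontab with `crontab -e` and paste the line; cron then takes care of the nightly run without any further interaction.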