Amazon S3 for backups


Yes. Yes I have.

But, I'm not going to share my script here.

But... general idea.

You already have your backup generated SOMEHOW. Right?
So, use the s3cmd that you already set up and authenticated on your server.
Write a script that uses s3cmd to move/remove the old backups, then upload the new one.
Put that script on a cron job. Fini~

My script makes daily copies, weekly copies, and monthly copies.
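A minimal sketch of that rotation idea, assuming s3cmd is already configured; "mybucket" and the file name are placeholders, and the actual upload is shown as a comment:

```shell
#!/bin/bash
# Sketch: route today's backup to a daily/weekly/monthly folder on S3.
BACKUP="site-$(date +%Y%m%d).tgz"

DEST=daily
if [ "$(date +%u)" = 7 ]; then DEST=weekly; fi    # Sunday's copy goes to weekly/
if [ "$(date +%d)" = 01 ]; then DEST=monthly; fi  # the 1st's copy goes to monthly/

# The real script would run: s3cmd put "$BACKUP" "s3://mybucket/$DEST/"
echo "s3://mybucket/$DEST/$BACKUP"
```

Hooked up with a crontab line like `0 3 * * * /path/to/backup2s3.sh` (path is a placeholder).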


The primary reason I wasn't willing to share my script is that it's customized for me. It won't really work for you without an extensive overhaul... lol. My hosting environment isn't exactly typical...

Here's some tutorials I found... with scripts attached.
https://github.com/woxxy/MySQL-backup-to-Amazon-S3 (Seems the simplest, but MySQL only. You could easily add a few lines to include your website files.)
http://www.problogdesign.com/how-to/automatic-amazon-s3-backups-on-ubuntu-debian/ (Uses duplicity. Needs more installation than basic s3cmd.)


I use S3 as well. My script backs up MySQL nightly and nginx's html folder weekly. It compacts the IPB database folder files into a *.tgz file and also creates a single databasename.sql file for easy phpMyAdmin import. Then both of those files are zipped into a nightly *.zip file. It also prunes anything older than 7 days from the S3 db bucket and anything older than 3 weeks from the html backup bucket.
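One way to do that kind of pruning with s3cmd is to parse `s3cmd ls` output (date, time, size, path) and delete anything past a cutoff. A hypothetical dry-run sketch, with the bucket name as a placeholder and the actual delete shown as a comment:

```shell
#!/bin/bash
# prune DAYS: reads "s3cmd ls"-style lines on stdin, flags objects older than DAYS.
prune() {
  cutoff=$(date -d "$1 days ago" +%s)
  while read -r day time size path; do
    if [ "$(date -d "$day" +%s)" -lt "$cutoff" ]; then
      echo "DELETE $path"   # the real script would run: s3cmd del "$path"
    fi
  done
}

# Fed from `s3cmd ls s3://db-backups/` in the real script; a sample line here:
printf '2000-01-01 00:00 123 s3://db-backups/old.sql.gz\n' | prune 7
```

(`date -d` is GNU date, so this assumes Linux rather than BSD/macOS.)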

It keeps a single local copy of each previous backup.

In addition, I have my EC2 Linux AMI install image with snapshots sitting in wait, so in case EC2 ever becomes unusable (not very likely, haha) I can simply fire up a clone (LEMP) server in seconds. I was using ELB, but wanted to keep costs low, so I reverted to one instance for now; PHP-FPM via sockets with Nginx is a beast!

I use RDS for larger non-IPB sites' databases; it saves all the stress. :) But I think backing up to S3 is one of the best solutions for anyone, whether or not they use EC2.

Go for it!


Here you go. Create a new file called 'backup2s3.sh' and put this in it:

(It could be tidied up a lot, but you get the idea. Also, it only backs up DBs.)


#!/bin/bash

# List databases to dump ([user], [password], [s3bucketname] are placeholders)
DBS="$(mysql -u [user] -p[password] -Bse 'show databases')"

for db in $DBS
do
   DMPFILE="$db"_$(date +"%Y%m%d").sql
   MONTH=$(date +"%Y%m")

   echo -e "Dumping $db \n"

   mysqldump -u [user] -p[password] "$db" > "$DMPFILE"
   gzip "$DMPFILE"
   s3cmd put "$DMPFILE.gz" "s3://[s3bucketname]/$MONTH/$DMPFILE.gz"
   rm "$DMPFILE.gz"
done

On S3 you can set an expiration period (a lifecycle rule) on the bucket. Set it to however many days/weeks of backups you want to keep.
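For reference, an expiration rule is a small lifecycle configuration on the bucket; you can set it from the S3 console, or via the bucket lifecycle API with XML along these lines (the ID and 30-day period here are placeholders, not anything from the post):

```xml
<LifecycleConfiguration>
  <Rule>
    <ID>expire-old-backups</ID>
    <Prefix></Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>30</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```

With the empty Prefix the rule applies to the whole bucket; set a prefix if you only want it on one backup folder.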



This topic is now archived and is closed to further replies.
