Zhana Posted May 6, 2012 Hi, is anyone using Amazon S3 for storing backups? Thank you.
FCB-ROGA Posted May 7, 2012 Yes. Yes I do. Have you found a good way to push your backup from your live server to S3 storage with a cron job that uploads the new backup and removes old ones?
Grumpy Posted May 7, 2012 Yes. Yes I have. But I'm not going to share my script here. The general idea: you already have your backup somehow, right? So use s3cmd, which you have already set up and authenticated on your server. Write a script that uses s3cmd to move/remove the old backups, then upload the new one. Put that script on a cron job. Fini~ My script makes daily, weekly, and monthly copies.
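The daily/weekly/monthly rotation described above can be sketched roughly like this. The bucket name and backup path are hypothetical, and to keep the sketch safe to run every s3cmd call goes through a dry-run wrapper that only prints the command; change run() to execute "$@" once you have s3cmd configured:

```shell
#!/bin/sh
# Sketch of a daily/weekly/monthly S3 rotation, in the spirit of the idea above.
# BUCKET and BACKUP are hypothetical -- adapt them to your own setup.

BUCKET="s3://my-backup-bucket"                     # hypothetical bucket
BACKUP="/var/backups/site-$(date +%Y%m%d).tar.gz"  # produced by your existing backup step

# Dry-run wrapper: prints the command instead of executing it.
# Replace the body with: "$@"  to run the commands for real.
run() { echo "$@"; }

# Always upload today's copy
run s3cmd put "$BACKUP" "$BUCKET/daily/"

# On Sundays, also keep a weekly copy
if [ "$(date +%u)" -eq 7 ]; then
    run s3cmd put "$BACKUP" "$BUCKET/weekly/"
fi

# On the 1st of each month, also keep a monthly copy
if [ "$(date +%d)" = "01" ]; then
    run s3cmd put "$BACKUP" "$BUCKET/monthly/"
fi
```

Cleaning out old copies in each prefix (keep the last 7 dailies, and so on) would be a second loop over `s3cmd ls` output, deleting with `s3cmd del`.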
Zhana (Author) Posted May 7, 2012 Hi Grumpy, any chance you're willing to sell the script? I am very much interested. Thank you.
Grumpy Posted May 7, 2012 The primary reason I wasn't willing to share my script is that it's customized for me. It won't really work for you without an extensive overhaul... lol. My hosting environment isn't exactly typical. Here are some tutorials I found, with scripts attached:
https://github.com/woxxy/MySQL-backup-to-Amazon-S3 (seems the simplest, but MySQL only; you could easily add a few lines to include your website files)
http://atlchris.com/828/how-to-backup-your-website-to-amazon-s3-automatically/
http://www.problogdesign.com/how-to/automatic-amazon-s3-backups-on-ubuntu-debian/ (uses duplicity, which needs more installation than basic s3cmd)
Zhana (Author) Posted May 7, 2012 Thank you, Grumpy. I guess I will have to hire a coder/admin to set this up for me, lol.
altenerg Posted May 15, 2012 I use S3 as well. My script backs up MySQL nightly and nginx's html folder weekly. It compacts the IPB database folder files into a *.tgz file and also creates a single databasename.sql file for easy phpMyAdmin import, then both of those files are zipped into a nightly *.zip file. It also prunes anything older than 7 days from the S3 db bucket and anything older than 3 weeks from the html backup bucket, and it keeps a single local copy of each previous backup. In addition, I have my EC2 Linux AMI install image with snapshots sitting in wait, so in case EC2 ever becomes unusable (not very likely, haha) I can simply fire up a clone (LEMP) server in seconds. I was using ELB, but I want to keep costs low, so I reverted to one instance for now - PHP-FPM via sockets with Nginx is a beast! I use RDS for larger non-IPB sites' databases; it saves all the stress. :) But I think backing up to S3 is one of the best solutions for anyone, whether or not they use EC2. Go for it!
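The "prune anything older than N days" step described above can be sketched like this. It assumes GNU date (for `date -d`) and an authenticated s3cmd; the bucket name is hypothetical, and the delete command is only echoed so nothing is removed until you have checked the output:

```shell
#!/bin/sh
# Sketch: prune S3 backups older than KEEP_DAYS, per the idea above.
# Assumes GNU date and a configured s3cmd; bucket name is hypothetical.

BUCKET="s3://my-db-backups"   # hypothetical bucket
KEEP_DAYS=7
cutoff=$(date -d "-$KEEP_DAYS days" +%s)

# "s3cmd ls" prints lines like:  2012-05-15 04:00  12345  s3://bucket/file.gz
# so the first field is a YYYY-MM-DD date we can compare against the cutoff.
older_than_cutoff() {
    [ "$(date -d "$1" +%s)" -lt "$cutoff" ]
}

prune_old() {
    s3cmd ls "$BUCKET/" | while read -r day _time _size path; do
        # Deletes are echoed only; drop "echo" once you trust the output.
        older_than_cutoff "$day" && echo s3cmd del "$path"
    done
}

# prune_old    # uncomment once s3cmd is configured for your bucket
```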
tomturd2 Posted May 15, 2012 Here you go. Create a new file called 'backup2s3.sh' and put this in it (it could be tidied up a lot, but you get the idea; also, it only backs up DBs):

#!/bin/sh
# List databases to dump
DBS="$(mysql -u [user] -p[password] -Bse 'show databases')"

for db in $DBS
do
    DMPFILE="$db"_$(date +"%Y%m%d").sql
    MONTH=$(date +"%Y%m")
    echo -e "Dumping $db \n"
    mysqldump -u [user] -p[password] "$db" > "$DMPFILE"
    gzip "$DMPFILE"
    s3cmd put "$DMPFILE.gz" "s3://[s3bucketname]/$MONTH/$DMPFILE.gz"
    rm "$DMPFILE.gz"
done

On S3 you can set an expiration period - set this to however many days/weeks of backups you want to keep.
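For the expiration period mentioned above, you can set it in the AWS console, and newer versions of s3cmd can also set a bucket lifecycle rule from the command line. The exact flags below are an assumption - check `s3cmd --help` for your version before relying on them:

```shell
# Hypothetical bucket; asks S3 to delete objects automatically after 30 days.
s3cmd expire s3://my-backup-bucket --expiry-days=30
```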
This topic is now archived and is closed to further replies.