SG Staff Posted January 24, 2013 Has anyone set anything like this up? I wanted to do this, but I'm not quite getting how to set it up. I want to automate the process so I don't have to think about it at all. Some sort of incremental backup would be best, but I would even take a full backup once a week if that's what it has to be. This doesn't have to be to S3 either - it could be to another web server or something. Thanks!
Grumpy Posted January 24, 2013 http://community.invisionpower.com/topic/362214-amazon-s3-for-backups/
handsoffsam Posted January 25, 2013 This is the script that I use for this. It works well for me:

=====
#!/bin/bash

BACKUP_DIR="/backups"
MYSQL_FILENAME="$BACKUP_DIR/prod-$(date +%m%d%Y).sql"
DOCROOT_FILENAME="$BACKUP_DIR/prod-$(date +%m%d%Y).tar.gz"
ETC_FILENAME="$BACKUP_DIR/etc-$(date +%m%d%Y).tar.gz"
BACKUP_HISTORY="365 days"

echo "Creating backups..."

# Dump and compress the database
mysqldump -u mysql_username --password=mysql_password mysql_database > "$MYSQL_FILENAME"
gzip "$MYSQL_FILENAME"

# Archive the document root
pushd /opt/www
tar cfz "$DOCROOT_FILENAME" prod/
popd

# Archive the relevant server configuration
pushd /etc
tar cfz "$ETC_FILENAME" php* nginx* http* sphinx*
popd

# Upload to S3, removing each local copy only if its upload succeeded
s3cmd put "${MYSQL_FILENAME}.gz" s3://yoursite-backups/ && rm -f "${MYSQL_FILENAME}.gz"
s3cmd put "$DOCROOT_FILENAME" s3://yoursite-backups/ && rm -f "$DOCROOT_FILENAME"
s3cmd put "$ETC_FILENAME" s3://yoursite-backups/ && rm -f "$ETC_FILENAME"

echo "... done."
echo

echo "Cleaning up old backups, deleting files older than $BACKUP_HISTORY..."
s3cmd ls s3://yoursite-backups/ | while read -r line; do
    createDate=$(echo "$line" | awk '{print $1" "$2}')
    createDate=$(date -d "$createDate" +%s)
    olderThan=$(date -d "-$BACKUP_HISTORY" +%s)
    if [[ $createDate -lt $olderThan ]]; then
        fileName=$(echo "$line" | awk '{print $4}')
        if [[ $fileName != "" ]]; then
            echo "... removing $fileName ..."
            s3cmd del "$fileName"
        fi
    fi
done
echo "... done."
=====

robert
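To automate a script like the one above, a cron entry is enough. A minimal sketch, assuming the script is saved as /usr/local/bin/s3-backup.sh (a hypothetical path), made executable, and that s3cmd has already been set up once with "s3cmd --configure":

=====
# Hypothetical crontab entry (edit with `crontab -e`): run the backup
# script every night at 03:00 and append its output to a log for review.
0 3 * * * /usr/local/bin/s3-backup.sh >> /var/log/s3-backup.log 2>&1
=====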
.Nuno. Posted January 25, 2013 I used to use S3, but I like to rsync the data and just got a 25GB VPS for this... it's cheaper and I have full control.
handsoffsam Posted January 25, 2013 I used to use S3, but I like to rsync the data and just got a 25GB VPS for this... it's cheaper and I have full control. The problem with rsync is that you're only getting one copy of the data. That's not a really good backup strategy. It is not uncommon for corruption to take more than a day to discover, or for you to want back something that was deleted a week ago. There are rsync-based solutions that do cover this, but I'm not sure if you're using one of them. Storage is really cheap, and even cheaper with Amazon's Glacier functionality. There is basically no reason not to keep daily backups going for a year or so, IMHO. robert
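As an aside on the Glacier point: rather than handling archival inside the backup script, older objects can be moved to Glacier with a bucket lifecycle rule. A rough sketch using the AWS CLI; the bucket name and day counts are placeholders, and the tooling here is an assumption rather than anything from the posts above:

=====
# Hypothetical lifecycle rule: transition backups to Glacier after 30 days
# and expire (delete) them after 365 days. Assumes the AWS CLI is configured.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration \
    --bucket yoursite-backups \
    --lifecycle-configuration file://lifecycle.json
=====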
.Nuno. Posted January 25, 2013 The problem with rsync is that you're only getting one copy of the data. That's not a really good backup strategy. "rsync" was the word I used, but of course that on its own isn't how I do backups. I was answering this paragraph :smile: This doesn't have to be to S3 either - it could be to another web server or something. I use rdiff-backup locally and remotely as an incremental solution. I use rsync locally and remotely so I can have a "fresh" copy that is at most one hour old. All of this is also replicated at home :smile:
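For anyone who wants to see what that combination looks like, a rough sketch of the two pieces; the host, paths and retention period are made up:

=====
# Incremental history with rdiff-backup: stores reverse diffs so older
# versions stay restorable, then trims increments older than 30 days.
rdiff-backup /opt/www/prod backupuser@backuphost::/backups/prod
rdiff-backup --remove-older-than 30D backupuser@backuphost::/backups/prod

# Hourly "fresh" mirror with rsync (run from cron): a plain copy of the
# current state, with deletions propagated to the mirror.
rsync -a --delete /opt/www/prod/ backupuser@backuphost:/mirror/prod/
=====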
stoo2000 Posted January 25, 2013 This is the script that I use for this. It works well for me: <snip> robert Don't forget you can use PIGZ (http://www.zlib.net/pigz/) to make the archive part even quicker.
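As an example of what that might look like in the script above (the thread count is arbitrary), the docroot archive step could pipe tar through pigz instead of using tar's built-in gzip:

=====
# Compress with pigz across 4 threads instead of single-threaded gzip:
# tar writes an uncompressed stream to stdout and pigz compresses it.
pushd /opt/www
tar cf - prod/ | pigz -p 4 > "$DOCROOT_FILENAME"
popd
=====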
Archived
This topic is now archived and is closed to further replies.