Nervosa Posted August 20, 2009

Because there is always more than one way to do something, I'd like to know how others back up a large database on an active site. I currently stop MySQL, copy the data folder somewhere else, then start MySQL again. This takes my forums offline for a few minutes and shows the dreaded IPS error (because MySQL isn't running). Is there a better way to do this?
RobertMidd Posted August 20, 2009

I use http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
rct2·com Posted August 20, 2009

I use a shell script (scheduled via crontab) that temporarily replaces the forum's index.php with a 'come back later' page, runs mysqldump, then reinstates the original index.php. There is a link in my sig to an article about this, although admittedly the article is a bit out of date now.
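The swap-the-front-page routine described above can be sketched as a small shell script. Everything here is illustrative: the demo web root, file contents, and credentials are invented, and the actual mysqldump call is left commented out because it needs a live server.

```shell
#!/bin/sh
# Sketch of the 'replace index.php, dump, restore' routine described above.
# DOCROOT is a throwaway demo directory, not a real web root.
DOCROOT=./demo_forum
mkdir -p "$DOCROOT"
printf '%s\n' '<?php /* real forum front controller */ ?>' > "$DOCROOT/index.php"
printf '%s\n' '<?php echo "Back in a few minutes - backup running"; ?>' > "$DOCROOT/maintenance.php"

# 1. Park the live index.php and serve the holding page in its place.
mv "$DOCROOT/index.php" "$DOCROOT/index.php.live"
cp "$DOCROOT/maintenance.php" "$DOCROOT/index.php"

# 2. Dump the database while no board page can write to it.
#    (commented out: needs a running MySQL server and real credentials)
# mysqldump -uUSER -pPASSWORD DBNAME > backup.sql

# 3. Put the real front page back.
mv "$DOCROOT/index.php.live" "$DOCROOT/index.php"
```

Because visitors only ever see a static page while step 2 runs, nothing can write to the board's tables mid-dump.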
3DKiwi Posted August 20, 2009

I don't take my board offline to do it. I use the Linux commands from the backup and restore instructions in the knowledge base in your account here. I log in via SSH using PuTTY (a free SSH client), paste the backup command in, and 30 seconds later my db, which is around 200 MB, is backed up. I then download it via FTP. Restoring is similarly easy. The advantage of using SSH is you don't get page timeout errors and it's much quicker. 3DKiwi
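A variation on the SSH-then-FTP workflow above: the dump can be run remotely and streamed straight down in one step. This sketch only writes the command into a helper script rather than running it, since the host and credentials are placeholders.

```shell
# Placeholder host - substitute your own server and login.
REMOTE="user@yourserver.example"

# Write the one-step remote dump command into a helper script instead of
# executing it (it would need a real server). The remote pipeline gzips on
# the server, and the local redirect lands the file on your own machine.
cat > pull_backup.sh <<EOF
#!/bin/sh
ssh $REMOTE "mysqldump -uUSER -p DBNAME | gzip" > local_backup.sql.gz
EOF
chmod +x pull_backup.sh
```

This skips the separate FTP download entirely: the backup never sits uncompressed on the server.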
Nervosa Posted August 20, 2009 Author Posted August 20, 2009 Excuse my ignorance. My thinking is, if the tables are big its going to take a few min for the backup to copy. During this time someone could be posting, signing up, or doing an action that writes to a table. Wouldn't this cause an error of some sorts, trying to copy a file that's being written to? Couldn't people still be accessing stuff while index.php is removed?
.Ian Posted August 20, 2009

MySQL Backup in Unix or Linux

If you're on a Linux/Unix machine, you can use the command-line tools via SSH to create a database dump (a backup of the MySQL database) with:

mysqldump -uusername -ppassword databasename > databasename.sql

If you wish to compress the output so that it's easier to transfer around:

tar -cvpzf databasename.tgz databasename.sql

If you've compressed the output and would like to decompress it:

tar -xzvf databasename.tgz

(the output will extract as databasename.sql)

And to restore the backup, after you purge data out of the MySQL database:

mysql -h localhost -uusername -ppassword databasename < databasename.sql
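The compress/decompress steps above can be exercised end to end with a dummy dump file; only the tar round trip is shown here, since the mysqldump and mysql steps need a live server.

```shell
# Stand-in for a real dump so the example runs anywhere.
echo "CREATE TABLE demo (id INT);" > databasename.sql

# Compress with the same flags as above, remove the original,
# then extract it back from the archive.
tar -cvpzf databasename.tgz databasename.sql
rm databasename.sql
tar -xzvf databasename.tgz

# databasename.sql is now restored, byte for byte.
```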
Nervosa (Author) Posted August 20, 2009

I was reading here. Getting another server is something I don't want to do right now, and having my board offline for a minute or two is fine.

"Normally, when you want to create a MySQL backup, you either have to stop MySQL or issue a read lock on your MySQL tables in order to get a correct backup; if you don't do it this way, you can end up with an inconsistent backup."

The article goes on to tell you how to replicate your database on another server.
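The lock-or-stop trade-off quoted above depends on the storage engine. mysqldump's real `--single-transaction` option gives a consistent snapshot of InnoDB tables with no downtime, while `--lock-tables` is the safe choice for MyISAM. A minimal sketch, with placeholder credentials and the engine hard-coded (on a real server you would check it with SHOW TABLE STATUS):

```shell
# Sketch: choosing mysqldump flags for a consistent dump without taking
# the board down. ENGINE is hard-coded for the example.
ENGINE=InnoDB

if [ "$ENGINE" = "InnoDB" ]; then
    # InnoDB supports transactions: one consistent snapshot, and readers
    # and writers keep working while the dump runs.
    OPTS="--single-transaction --quick"
else
    # MyISAM has no transactions, so tables are locked for the duration.
    OPTS="--lock-tables"
fi

# Only echoed here; a real run needs a live server and credentials.
echo "mysqldump $OPTS -uUSER -p DBNAME > backup.sql"
```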
rct2·com Posted August 20, 2009

Quoting Nervosa: "Wouldn't this cause an error of some sorts, trying to copy a file that's being written to? Couldn't people still be accessing stuff while index.php is removed?"

As I say, my article is a little out of date. The scripts I now use replace every PHP file that WRITES to the database with a 'please come back later' page. In the case of IP.Board, replacing index.php stops anybody carrying out any IP.Board database reading/writing, such as posting, registering, PMing etc. Other parts of my site may have pages that access other databases, but not the IP.Board database.
thompsone Posted August 30, 2009

Quoting isdoo's MySQL backup instructions above: :thumbsup:

Once you've had to recover from backups you learn how important they are: not just the backup file itself, but the ability to recover from that file. You can make backups all day long, and if you can't recover from them they are useless. Along these lines, this post is a nice clean way to dump your SQL tables. I run a series of cron tasks nightly that perform these functions without the need to take any of the services offline.

1) Just prior to the dump I remove the old backup (I keep only one good copy locally, as I keep a 7-day rotation completely offsite in addition to this):

50 22 * * * rm -f /path_to_archives/database.sql.gz

2) I run the mysqldump with compression:

0 23 * * * mysqldump -u your_user -p'your_password' --all-databases | gzip > /path_to_archives/database.sql.gz

A) I then use scp to move the compressed file to another machine offsite, which also manages a rotation of the files, keeping only the last 7 days archived, as our dump file is 480G in size even with compression. (Optional - Advanced)

B) In addition to this I also use rsnapshot, which handles nice neat backups. (Optional - Advanced)

C) Some other ways I can imagine handling this might be pulling the dump from the server to your local machine by installing MySQL Administrator locally and setting up the backup task. (Optional - Moderate)

Either way you handle it, it's still a good idea to have backups.
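The remove-then-dump-then-rotate routine above can be sketched in one script. The directory, filenames, and offsite host are invented for the example, and the mysqldump and scp lines are commented out since they need real servers; a dummy payload keeps the sketch runnable anywhere.

```shell
#!/bin/sh
# Sketch of a date-stamped local backup with a 7-day rotation.
# ARCHIVE is a throwaway demo directory.
ARCHIVE=./backup_demo
mkdir -p "$ARCHIVE"

# A date-stamped name means last night's file is never overwritten.
STAMP=$(date +%Y-%m-%d)
TARGET="$ARCHIVE/database.$STAMP.sql.gz"

# The real nightly job would be:
#   mysqldump -u your_user -p'your_password' --all-databases | gzip > "$TARGET"
echo "-- dump placeholder" | gzip > "$TARGET"

# Keep only the last 7 days locally...
find "$ARCHIVE" -name 'database.*.sql.gz' -mtime +7 -delete
# ...and ship tonight's file offsite (hypothetical host and path):
# scp "$TARGET" backup@offsite.example:/srv/backups/
```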
Jerod Poore Posted August 31, 2009

Quoting Nervosa: "Normally, when you want to create a MySQL backup, you either have to stop MySQL or issue a read lock on your MySQL tables in order to get a correct backup; if you don't do it this way, you can end up with an inconsistent backup."

From my days as a sysadmin, the table locks should keep the integrity of the database. With all the various flavors of MySQL and OS in the mix, though, temporarily denying access to one's board is the only guaranteed method. If you're tight for space, isdoo's suggestion is the way to go: tar with the -z argument compresses with gzip as it archives, so you don't need a separate compression step. With tar it's faster and easier to back up and compress all the other files associated with IPB, so if you have it in a script it's just one more command. The tar argument --index-file=[somefilename] creates a list of all the files you backed up, which can come in handy.
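The tar points above can be exercised with a throwaway directory standing in for an IPB install. `--index-file` is a GNU tar option, so this assumes GNU tar (the default on most Linux servers); the demo layout is invented.

```shell
# Invented demo layout standing in for a board install plus its dump.
mkdir -p site_demo/uploads
echo "-- dump" > site_demo/backup.sql
echo "fake-avatar-bytes" > site_demo/uploads/avatar.png

# -z gzips as it archives; -v with --index-file sends the file listing
# to backup_index.txt instead of the terminal, giving you a manifest of
# everything that went into the archive.
tar -cvpzf site_backup.tgz --index-file=backup_index.txt site_demo
```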
Velvet Elvis Posted September 17, 2009

I use 'mysqldump -u root -p --opt databasename | gzip > databasename.day.month.year.gz'. I used to use backupninja on my old server to automate everything, including downloads to my local PC, but haven't gotten around to setting it up on my current server.
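A day.month.year name like the one above can be generated with date(1) so each night's dump gets its own file. One cron-specific wrinkle worth knowing: in a crontab entry, % signs must be escaped, because cron treats a bare % as a newline. The cron line below is illustrative only (placeholder password and path).

```shell
# Build the dated filename the same way the command above names its output.
NAME="databasename.$(date +%d.%m.%Y).gz"
echo "$NAME"

# Illustrative crontab entry - note the escaped \% signs:
# 0 3 * * * mysqldump -u root -p'PASS' --opt databasename | gzip > /backups/databasename.$(date +\%d.\%m.\%Y).gz
```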
Archived
This topic is now archived and is closed to further replies.