mysql dump / Cron job problem


I don't know if this is the correct way to do it, but it's not working, and I was hoping someone could help me out with this.

Also, I was wondering for large databases, would it time out since it's running from a php file?

<?php

require_once('/home/iweb/public_html/cronJob/db-connect.php'); // connect to db

// Note: mysql_query() executes SQL statements, not shell commands,
// so it can never run mysqldump. Use shell_exec() instead.
set_time_limit(0); // large databases can exceed PHP's default 30-second limit

$cmd = "mysqldump -u iweb_user -pXXXX --single-transaction iweb_forums | gzip > /home/iweb/public_html/_db-backup/iweb_forums.sql.gz";

shell_exec($cmd); // run the dump

?>

If you're doing this as a cron job, why not just run it directly from the console instead of through a PHP script?

  • Author

I'm not really sure how. I'm new to this cron job thing and I thought using cpanel would be easier for me to manage.

I found that cron jobs would hang and sit dormant in system memory when trying to back up massive databases, eventually causing servers to crash from a shortage of memory.

  • Author

Any solutions?

If you have root SSH access:

Step 1: Run this

 

crontab -e

Step 2: Run this (editing the path, username/password, and databases as desired). This will run it at midnight, based on your server's time:



0 0 * * * mysqldump -u mysqluser -h localhost --all-databases | gzip -9 > alldb.sql.gz 2> /dev/null
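For reference, the five leading fields of a crontab line are minute, hour, day of month, month, and day of week (a comments-only sketch; the mysqldump pipeline shown is illustrative):

```shell
# ┌───────── minute (0-59)
# │ ┌─────── hour (0-23)
# │ │ ┌───── day of month (1-31)
# │ │ │ ┌─── month (1-12)
# │ │ │ │ ┌─ day of week (0-6, Sunday = 0)
# │ │ │ │ │
# 0 0 * * *  mysqldump --all-databases | gzip -9 > alldb.sql.gz
#
# So "0 0 * * *" means minute 0 of hour 0 -- midnight -- every day.
```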

  • Author

Yes, I have SSH :D

What does this command do?

crontab -e



I have more than one database, for different things other than the forum. Is there a way I can have it back up just one database? Also, wouldn't I need something like -p for the password in there?


It puts your crontab into an edit mode. Since your console user should have root permissions to MySQL, you shouldn't need to specify a username/password individually. In short, just replace the mysqldump syntax your PHP script was using with what you're putting in the cron command line.
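On the password question for a single database: rather than putting -p on the cron line (where the password is visible to other users via `ps`), mysqldump can read credentials from an option file in your home directory. A sketch, reusing the username/password/database names from the first post; the paths are illustrative:

```shell
# Store mysqldump credentials in ~/.my.cnf instead of on the command line.
# mysqldump reads the [mysqldump] group of this file automatically.
cat > "$HOME/.my.cnf" <<'EOF'
[mysqldump]
user=iweb_user
password=XXXX
EOF
chmod 600 "$HOME/.my.cnf"   # keep it readable by you only

# The crontab line then needs no -u/-p and dumps just the one database:
# 0 0 * * * mysqldump --single-transaction iweb_forums | gzip > /home/iweb/public_html/_db-backup/iweb_forums.sql.gz
```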

I suggest using Cpanel if you have it to set up Cron jobs. This then avoids the Linux mumbo jumbo.

3DKiwi

  • Author

Thanks Robulosity!

If I wanted to set up more than one cron job, would I just put another line underneath?

0 0 * * * mysqldump -u root -h localhost --single-transaction DB1 | gzip -9 > db1.sql.gz 2> /dev/null

0 1 * * * mysqldump -u root -h localhost --single-transaction DB2 | gzip -9 > db2.sql.gz 2> /dev/null



One last question, what does the -9 do for "gzip -9"?
(sorry for all these questions)

  • Author

I suggest using Cpanel if you have it to set up Cron jobs. This then avoids the Linux mumbo jumbo.
Yeah, I tried to set one up with cPanel (see the first post), but it wouldn't work.
:(

Ask your host to set it up for you. If they are any good they should do it for free.

3DKiwi

  • Author

Yeah, they're pretty good. I'm hosted by ServInt and I'm very happy with them. I just feel like I ask too many questions sometimes and don't want to bother them too much, so I figured I'd come on here and see if anyone has time to shed some light on a few issues I've been having.

Anyway, I've already got this set up via SSH, with the backup scheduled for midnight. I'll check back in an hour to see if it went through. :)

Thanks guys!

You also don't want to set up cron to run multiple backup jobs in tandem; this can, and likely will, cause MySQL to grind to a halt, or just peg your CPU and more or less kill your VPS, depending on how big the databases are.
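One way to keep a slow dump from overlapping the next scheduled run is to wrap the cron command in flock, which skips the job if the previous one still holds the lock. A sketch; the lock-file path is hypothetical and the echo stands in for the real mysqldump pipeline:

```shell
# flock -n grabs the lock file non-blockingly; if a previous backup still
# holds it, this run exits instead of piling a second mysqldump on top.
lock=/tmp/db-backup.lock   # hypothetical lock-file path

flock -n "$lock" -c 'echo "backup ran"' \
  || echo "skipped: previous backup still running"

# In crontab this would look like (paths/databases illustrative):
# 0 0 * * * flock -n /tmp/db-backup.lock -c 'mysqldump --single-transaction DB1 | gzip -9 > db1.sql.gz'
```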

There's a third-party plugin buried in cPanel you can install that should help out. Or do what I do: install Webmin (which, from all my testing, does NOT break cPanel/WHM at all) from http://webmin.com/ and use its UI to set up cron jobs. That way there's no Linux command line, and it gives you some level of control without having to SSH in. It can be run via SSL or HTTP by editing the configuration file.

And when in doubt about the command switches, just run gzip --help :P
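For the -9 question specifically: gzip's -1 through -9 flags trade speed for compression ratio, with -1 the fastest/largest, -9 the slowest/smallest, and -6 the default. A quick demonstration on some repetitive text (the sample file names are just for the demo):

```shell
# Generate a compressible sample, then compare the two extremes.
yes "repetitive backup line" | head -n 20000 > sample.txt

gzip -1 -c sample.txt > fast.gz   # fastest, least compression
gzip -9 -c sample.txt > best.gz   # slowest, most compression

echo "gzip -1: $(wc -c < fast.gz) bytes"
echo "gzip -9: $(wc -c < best.gz) bytes"
```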

  • Author

Thanks Robulosity, I'll keep that in mind.
:)

Archived

This topic is now archived and is closed to further replies.
