How to automate backups of your MySQL database
For anyone who has a site using a MySQL database, you are probably going to want to back up your database on a regular basis. Things can go wrong: databases get corrupted, or even deleted by human error (I have done this myself), so it is always good to have a recent backup to restore from.
You can of course use phpMyAdmin to take a database dump, but this relies on you remembering to do it regularly.
Instead, I have found a couple of useful free tools that I use to do an automated backup of my database once per day, so I thought I would share a sort of tutorial on how to do this on BT's hosting.
Don't let the length of this put you off; it shouldn't take more than 20 minutes to set up, and if you value your data you will agree that it is worth the effort 🙂
-----------------
Firstly, the software you will need is MySQLDumper. Download it, unzip it and upload it to a folder in your webspace, e.g. /public/mysqldumper (it needs to be under /public to work).
Browse to the folder, e.g. www.yourdomain.com/mysqldumper, and go through the setup script. You will need to know your database details for this. BT's server does not seem to detect the database automatically, so you will need to enter the details manually.
Once installed, you should create the directory protection, which you will see on the status information page. This will password-protect the mysqldumper directory and stop people messing with it.
Go to the backup tab and run a PHP backup to make sure it works. This may take a few minutes.
Once you have verified that the PHP backup is working, you now have a tool that allows you to manually back up and restore databases whenever you wish.
You may be happy to leave things like this, however if you do want to automate things continue reading.
-----------------
Before we can automate things, you need to get the provided Perl backup script working, as this is what the automated backups use.
Since BT's server only allows Perl to be run from the /cgi-bin directory, some preparation work is required.
FTP into /public/mysqldumper/msd_cron and move the 3 .pl files inside it to the /cgi-bin directory.
In MySQLDumper, go to Backup > Backup Perl and you will see 3 headings and 3 configuration lines at the bottom of the page (6 lines in total). It is the parts in bold that we are interested in, in particular the first 2, which are headed:
"Entry in crondump.pl for absolute_path_of_configdir"
and
"URL for the browser or for external Cron job"
Copy the line in bold directly under "Entry in crondump.pl for absolute_path_of_configdir". This will look something like /services2/webpages/y/o/yourdomain.com/public/mysqldumper/work/config/
Open the file /cgi-bin/crondump.pl in a text editor and paste this path between the quotation marks in the line that looks like:
my $absolute_path_of_configdir="";
Save the file and re-upload it to the server. Make sure the 3 files within the /cgi-bin/ directory have CHMOD permissions of 755 (they should already have this).
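If you would rather script the edit to crondump.pl than do it by hand, here is a minimal sketch. The function name is my own, and the example path is just the placeholder format shown above — substitute your real config path:

```python
import re

def set_configdir(script_text, configdir):
    """Fill in the value of: my $absolute_path_of_configdir=""; in crondump.pl."""
    pattern = r'my \$absolute_path_of_configdir\s*=\s*"[^"]*";'
    replacement = 'my $absolute_path_of_configdir="%s";' % configdir
    return re.sub(pattern, replacement, script_text)

# Example with the placeholder path from MySQLDumper's Backup Perl page:
original = 'my $absolute_path_of_configdir="";'
patched = set_configdir(
    original,
    "/services2/webpages/y/o/yourdomain.com/public/mysqldumper/work/config/")
print(patched)
```

Run it over your local copy of crondump.pl, then upload the result as before.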
Back in the MySQLDumper web interface, go to Configuration > Cronscript, set the field 'Path of Perl scripts:' to /cgi-bin/ and save.
Go back to Backup > Backup Perl and make sure the 2 tests pass. Now click Run the Perl Cron script and make sure it completes correctly.
Now test that the external URL works by entering the URL that appears in bold under the heading "URL for the browser or for external Cron job" directly into a web browser. It will look something like http://www.yourdomain.com/cgi-bin/crondump.pl?config=mysqldumper
If all is good, you should be able to execute the backup by opening this URL. Congratulate yourself if you got this far 🙂
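If you want to check this from a script rather than a browser, something like the sketch below works. The URL is a placeholder, and the "error" check is my own crude heuristic for spotting a failed run, not anything MySQLDumper promises in its output:

```python
import urllib.request

# Placeholder: use the URL from Backup > Backup Perl for your own site.
CRON_URL = "http://www.yourdomain.com/cgi-bin/crondump.pl?config=mysqldumper"

def looks_successful(body):
    # Crude heuristic: treat any mention of "error" in the output as a failure.
    return "error" not in body.lower()

def trigger_backup(url=CRON_URL, timeout=120):
    """Hit the crondump URL and report whether the run looked OK."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        return resp.status == 200 and looks_successful(body)
```

Calling trigger_backup() then does the same thing as pasting the URL into your browser.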
-----------------
OK, now to the business of automating things. BT does not provide any cron job facility for this, so we need to use an external service that accesses the URL regularly, causing the Perl script to run. For this I signed up for a free account at http://www.cronless.com
Once registered you will be emailed an activation link - make sure to click on this.
Once logged in, click on Jobs and create a new job. I would advise going for once a day, scheduled for something like 4am when your site is less busy. The URL to use is the one you used previously to test that the Perl script worked. If you need reminding of it, you can find it in MySQLDumper under Backup > Backup Perl > URL for the browser or for external Cron job
Also go into the Settings tab and make sure your local time zone is selected.
If you got this far, you are pretty much done. Sit back and wait for the backups to happen. You will find them in your webspace in the folder /public/mysqldumper/work/backup and you can restore them at any time using MySQLDumper's restore facility.
-----------------
A few points about using MySQLDumper:
If you have multiple databases under the same user, the cron script defaults to backing up all of them. If you only need to back up one, you can select it under Configuration > Cronscript > Database
I would not recommend keeping every single backup on the server, as it will fill the server up over time and BT will not like that. Go into Configuration > Autodelete and set automatic deletion to enabled. I find that keeping the last 5 backups is more than enough. Anything older than 5 days will then be deleted every time the script runs.
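MySQLDumper's autodelete handles this for you, but the idea is simple enough to sketch: sort the backup files newest-first and delete everything past the first 5. A rough Python equivalent (the function names are my own, and the directory would be /public/mysqldumper/work/backup on BT's hosting):

```python
import os

def split_keep_delete(files_with_mtime, keep=5):
    """Given (path, mtime) pairs, return (survivors, doomed), newest first."""
    ordered = sorted(files_with_mtime, key=lambda pair: pair[1], reverse=True)
    return ordered[:keep], ordered[keep:]

def prune_backups(backup_dir, keep=5):
    """Delete all but the newest `keep` files in backup_dir."""
    entries = [(os.path.join(backup_dir, name), 0) for name in os.listdir(backup_dir)]
    entries = [(path, os.path.getmtime(path)) for path, _ in entries
               if os.path.isfile(path)]
    survivors, doomed = split_keep_delete(entries, keep)
    for path, _ in doomed:
        os.remove(path)
    return [path for path, _ in survivors]
```

The selection logic is kept separate from the deletion so you can dry-run it first.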
There is an option to FTP the files to another server if you wish, but I don't do this, as BT keeps backups you can restore from anyway should the worst happen. If you do set this up, the test will probably fail because BT's PHP does not support FTP, but the Perl script seems to, so it may still work.
I hope somebody finds this guide useful 🙂
Excellent guide, thanks.
I wish I'd read it before BT rebooted our DB server last night and lost 5 days' info, but at least having followed this we should be protected in future.
Nice tutorial, and I'm sure many will find it useful; however, I was surprised to learn that BT does not provide cron jobs. I have come across this situation before, but never on a professional shared web host.
Anyhow: No Cron, No Cry.
As Dave suggested, there are lots of sites that provide an external cron job service, and any junk email they send you is a small price to pay. Alternatively, for those feeling adventurous, here's how other folk work around the problem.
1. Create a Google App Engine
These are a bit fiddly to set up if you have never used them before, but on the flip side they are rock-solid reliable. Below I've included a Python snippet, or you could use Java, Go or PHP depending on what you're more comfortable with.
App-name.py
import webapp2
from google.appengine.api import urlfetch

class MainHandler(webapp2.RequestHandler):
    def get(self):
        self.response.write('Fire Cron Job \n')
        url = 'http://www.yourdomain.com/cgi-bin/msd_cron/crondump.pl?config=mysqldumper'
        urlfetch.set_default_fetch_deadline(30)
        result = urlfetch.fetch(url, headers={'Cache-Control': 'max-age=0, must-revalidate'})
        self.response.write(result.content)
        self.response.write(' - > ended')

app = webapp2.WSGIApplication([
    ('/', MainHandler)
], debug=True)
cron.yaml
cron:
- description: Database backup
  url: /
  schedule: every 12 hours
2. Mad Hatter Hack: because the script keeps randomly checking the clock to see if it's late
This is where you add a PHP snippet to something that appears on every page of the site, such as the header, footer text or forum login, so it gets called every time a visitor or Googlebot comes to look at the site. When the script runs it reads a simple txt file that holds the time of the last visit and updates it with the latest one. If the difference between the stored time and the current time is greater than the desired interval, it calls the database backup script.
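The same logic, sketched in Python for clarity (in practice you would write it in PHP as described above; the file name, interval and function names here are my own placeholders):

```python
import os
import time

STAMP_FILE = "last_backup.txt"   # placeholder: the txt file holding the last run time
INTERVAL = 24 * 60 * 60          # once a day, in seconds

def is_due(last_run, now, interval=INTERVAL):
    """True when enough time has passed since the last recorded run."""
    return (now - last_run) >= interval

def maybe_run_backup(run_backup, stamp_file=STAMP_FILE, interval=INTERVAL):
    """Call run_backup() at most roughly once per interval, using a timestamp file."""
    last_run = 0.0
    if os.path.exists(stamp_file):
        with open(stamp_file) as f:
            last_run = float(f.read().strip() or 0)
    now = time.time()
    if is_due(last_run, now, interval):
        # Record the new time first, to reduce the chance of two
        # simultaneous page hits both firing the backup.
        with open(stamp_file, "w") as f:
            f.write(str(now))
        run_backup()
```

Here run_backup would be whatever fetches the crondump.pl URL, as in the tutorial above.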
