Last updated August 14, 2012. Created by JohnNoc on May 7, 2005.
Edited by islalobo, Rodney Strong, SKrossa, colan.

General backup best practices

  • Always back up the entire site before updating or upgrading. (It is also a good idea before migrating, copying, moving, or replacing a site.)
  • Date your backups. Save each one into a directory or file whose name includes the date of the backup. You do not want to be guessing which backup is the most recent one when you are trying to recover your failed site. Panic is not conducive to a good recovery process.
  • Save a copy of each backup in a different location from your web server. Remember, if data doesn't exist in three places, it doesn't exist at all. If your web server crashes, any backup files stored on it may be gone too (see the earlier reference to panic). A sketch of a dated, off-site backup script follows this list.
  • Inquire about your ISP's or web host's backup policies. Most good web hosts have a backup plan that kicks in every 24 hours. If yours does not, it's up to you to back up your site daily, or to find a host that will. It cannot be stressed enough: if a site's data does not exist in three places, it doesn't exist at all.
  • In addition to your web host's backups, routinely make your own. Run a separate backup monthly, weekly, daily, or whatever fits your site's needs. Gauge the backup cycle by how much of your users' data you can afford to lose before the users revolt and start a riot. Even if you tell yourself it's no big deal, it is, and you do not want to lose content.
  • Document and test your backup and restore procedure before you need it. The forums and IRC chat are full of people who want help recovering from some unforeseen fatal error, and if they knew how to restore their site using the backups they'd created, they would be in business and making money rather than waiting for help from a Drupal volunteer.
  • Restore each backup using the same method you used to create it.
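
As a rough sketch of the dating and off-site habits above (every path, database name, and hostname below is a placeholder, not something this page prescribes):

    #!/bin/sh
    # Dump the database into a dated directory, then copy it off the web server.
    DATE=$(date +%Y-%m-%d)
    BACKUP_DIR="/home/example/backups/$DATE"
    mkdir -p "$BACKUP_DIR"

    # Better: keep the credentials in ~/.my.cnf instead of on the command line.
    mysqldump -u drupal_user -p'secret' drupal_db > "$BACKUP_DIR/drupal_db.sql"

    # A copy on a second machine means the backup survives a server crash.
    scp -r "$BACKUP_DIR" backup-user@offsite.example.com:/backups/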

Specific backup strategy

Backing up a Drupal site involves backing up both the site's database and its files.

Backing up the database

Backing up and restoring a MySQL database using the command line is covered in the MySQL Reference Manual, in the section on Backup and Recovery. Note that the Reference Manual section cited is based on MyISAM tables. If your database uses InnoDB tables, pay particular attention to MySQL Reference Manual section 14.2.6. There is also a handbook page about doing a Backup and restore using bash shell scripts.
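
For example, a minimal command-line backup and restore pair might look like the following (user and database names are placeholders; --single-transaction gives a consistent dump of InnoDB tables without locking them):

    # Back up: a consistent snapshot of InnoDB tables, no table locks.
    mysqldump -u drupal_user -p --single-transaction drupal_db > drupal_db.sql

    # Restore the same way the backup was taken, into an empty database.
    mysql -u drupal_user -p drupal_db < drupal_db.sql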

Backups can also be performed with a graphical utility such as phpMyAdmin. That site also has a FAQ entry called "How can I backup my database or table?". You can also take a look at the phpMyAdminToolkit.Dump program which, according to its home page, is a console application with functionality similar to mysqldump; it uses the phpMyAdmin API behind the scenes.

If you want to back up a larger MySQL database on hosted webspace with phpMyAdmin, you may run into problems because PHP processing time is often limited. Scripts such as MySQLDumper and BigDump work around this limit; MySQLDumper can also generate automatic backups.

Another alternative is to use a module. The Backup and Migrate module allows you to download backups, or to save backups on the server on a schedule; you can specify how many backups should be kept and how often they should be taken. Alternatives are the Database Administration module and the Backup module, both of which let you create a backup of the database tables from within Drupal itself (the Backup module will also back up your files). Note: be careful with the permissions.
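
On that last point about permissions: if backups are saved on the server, make sure the web server will not hand them out to visitors. One possible precaution (the path below is a placeholder, and some modules set this up for you):

    # Make the backup directory unreadable to other local users...
    chmod 700 sites/default/files/backups
    # ...and tell Apache not to serve anything from it (Apache 2.2 syntax).
    echo "Deny from all" > sites/default/files/backups/.htaccess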

If you are using a database other than MySQL, that vendor's documentation should have backup information. For a list of even more methods, consult the Handbook pages in the Upgrade Guide as well.

Backing up the core files

The root directory of your installation contains all the files that make Drupal do its magic. Sure, you *could* download these files from Drupal.org again if you ever needed them, but having your own copies of the core files is much faster and more efficient. If you've edited anything in the core files for your specific website, keeping a copy of the core files becomes mandatory, since the site's changelog is part of the root directory too.

As a suggestion, back up everything in the root directory monthly, and always back up the core files before an upgrade. It's been said before in other sections of the Handbook, but best practice means that backups of the core should be a regular habit. Do this every month as suggested, and when disaster strikes you won't be wondering whether you've recently made changes that are now lost forever.
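
A monthly snapshot of the whole root directory can be as simple as the following (both paths are placeholders):

    # Dated tarball of the entire Drupal root, core files included.
    tar czf /home/example/backups/drupal-root-$(date +%Y-%m).tar.gz -C /var/www drupal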

Backing up the non-core files

Contributed modules and their associated third-party files should be backed up monthly as well. If you are adhering to the best practices outlined in the previous section, your contributed modules will be part of the core files backup by default, since they live inside a directory of the root Drupal installation. If you do not have the bandwidth or storage to back up your entire root directory each month, you must back up /sites/all/modules/ (versions 5.x to 7.x) or /sites/example.com/modules/ (version 4.7 and lower). Failure to do this may leave you with no clear record of which modules you had and which versions they were. Imagine trying to restore your site right now without knowing which versions of modules to download from Drupal.org. Could you do it?

There are also files that are not part of the database, the core, or the contributed modules but are important to your site. The directory you specified in Administer > Site Configuration > File System holds uploaded pictures, users' icons, and other things you don't want to lose. If you are running your site in 'private' file transfer mode, chances are good that this files directory is not inside your root Drupal installation, and it therefore won't be backed up by the procedure outlined in "Backing up the core files" above.

The /sites/ directory is another one that you should take note of, since it holds lots of things relating to your site such as logos and embedded images. Custom themes are also stored here.

Make sure you back up these two directories weekly, or changes made by your users might be lost forever. On high-traffic sites, make daily backups of these two and keep the last 8 for safekeeping. If something goes wrong by accident and you can bring the site back exactly the way your users left it, you will have garnered their loyalty, increased word-of-mouth traffic for your site, and become their newest idol. You might even make it into their blog. Yes, backups can do this for you.
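
One way to keep that rolling set of the last 8 daily copies (a sketch; the paths, and the assumption that both directories sit under the Drupal root, are placeholders):

    #!/bin/sh
    # Daily dated backup of the sites and files directories.
    DEST=/home/example/backups/user-files
    mkdir -p "$DEST"
    tar czf "$DEST/files-$(date +%Y-%m-%d).tar.gz" -C /var/www/drupal sites files

    # Delete everything except the 8 newest archives.
    ls -1t "$DEST"/files-*.tar.gz | tail -n +9 | xargs -r rm --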


Comments

I know this is not a MySQL manual, but nevertheless I think it would be prudent to point out that you should at least take your site offline while backing up the database, as most backup mechanisms may produce inconsistent backups if there are writes to the database (such as people adding new nodes/comments) during the process.

There are of course advanced methods in the MySQL manual for sites that can't be taken offline, which should also be pointed out.

@Kami - although you're technically correct, mysqldump will typically lock tables as it backs them up (depending on the arguments passed and the storage engine used for your tables). If all that is happening to your database is the addition of extra rows, there shouldn't be too much of a problem with the consistency of a backup taken while the website is live. Needless to say, you should always restore your database to ensure the backup worked.
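
One way to act on that last piece of advice is to test-restore each dump into a scratch database (the names below are examples only):

    # Load the dump into a throwaway database and spot-check it.
    mysql -u root -p -e "CREATE DATABASE backup_test;"
    mysql -u root -p backup_test < drupal_db.sql
    mysql -u root -p -e "SHOW TABLES IN backup_test;"
    mysql -u root -p -e "DROP DATABASE backup_test;"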

If you need a very easy, automated, zero-technical-knowledge-required way to back up your website and database, may I suggest the following website (which I run):

http://www.backupmachine.com/

This provides automated daily backups of your website files and database to the Amazon S3 cloud (fully encrypted).

You can also use Comcure to back up your core files for free. You might remember us; we gave away an iPad at the Denver DrupalCon. Check us out :) We would love any feedback until we get the Drupal module released.

Pass --lock-tables=false to mysqldump to dump without locking tables.
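
For example (user and database names are placeholders):

    mysqldump --lock-tables=false -u drupal_user -p drupal_db > drupal_db.sql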

Deniz Dogan

This is the Drupal backup module I use: Backup Client-Server: http://drupal.org/project/usage/backup_client_server
It allows you to set automatic scheduled backups, send them to Amazon S3, back up single or multiple databases, and compress the website files with gzip. It's a great tool for migrating, setting up a dev copy, and backing up.

"If you use InnoDB tables".
It appears that for MySQL 5.0, Drupal 7 creates only InnoDB tables.

Just a quick point as nobody seems to have mentioned it...

Assuming you have your own server or VPS with root access or similar, the fastest way of backing up your MySQL database is to stop the MySQL server and then back up the contents of the MySQL data directory - typically /var/lib/mysql on Linux systems. This approach also guarantees you get everything: no danger of an SQL dump missing some vital tables or records because your query wasn't quite right, and no referential integrity issues.

Of course the site will be down while MySQL is stopped, but the backup (tar, simple file copy, etc.) will typically take only a few tens of seconds (depending on the size of your database, disk speed, etc.).

Assuming the small amount of downtime is acceptable, the process can be scripted/automated (a rough sketch follows the steps below). Basically:

1. Somehow show a suitable "site under maintenance" message (e.g. rename index.php and replace with a suitable alternative or use maintenance-page-offline.tpl.php).
2. Stop mysql (e.g. /etc/init.d/mysql stop)
3. cp/tar/rsync the contents of /var/lib/mysql to a suitable place - see note below.
4. Restart mysql
5. Reverse whatever you did in step 1.
6. Now that the site is back up, do whatever else is needed, e.g. gzip the new copy of the data, or send it (rsync perhaps) to another backup location.
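
A rough sketch of those six steps as a script (every path is an example, and maintenance.php stands in for whatever maintenance page you prepare):

    #!/bin/sh
    DRUPAL_ROOT=/var/www/drupal
    BACKUP_DIR=/home/example/backups/mysql-raw
    DATE=$(date +%Y-%m-%d)

    # 1. Swap in a maintenance page for the real front controller.
    mv "$DRUPAL_ROOT/index.php" "$DRUPAL_ROOT/index.php.off"
    cp /home/example/maintenance.php "$DRUPAL_ROOT/index.php"

    # 2. Stop MySQL so the data files are quiescent.
    /etc/init.d/mysql stop

    # 3. Copy the raw data directory (check my.cnf if yours lives elsewhere).
    mkdir -p "$BACKUP_DIR/$DATE"
    cp -a /var/lib/mysql/. "$BACKUP_DIR/$DATE/"

    # 4. Restart MySQL.
    /etc/init.d/mysql start

    # 5. Put the real index.php back (overwrites the maintenance copy).
    mv "$DRUPAL_ROOT/index.php.off" "$DRUPAL_ROOT/index.php"

    # 6. With the site back up, compress and/or ship the copy elsewhere.
    tar czf "$BACKUP_DIR/mysql-$DATE.tar.gz" -C "$BACKUP_DIR" "$DATE"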

Notes:
If your data directory is not /var/lib/mysql, the correct directory can be found by looking in my.cnf (typically at /etc/mysql/my.cnf) for the datadir setting.

I often use rsync as it provides a fairly easy way of keeping incremental backups.

Step 3 - fastest way to copy: on some systems (fast processor, slow disk) it may be faster to compress while copying, because MySQL data compresses a lot, reducing disk writes considerably at the cost of CPU time. Maybe even write to /dev/shm if it is big enough. Try comparing the speed of tar cf with tar zcf (or pipe to gzip, etc., and try different compression levels) to find the fastest combination on your system.

What's the best method to run a backup of the "files" directory when it contains a large number of files?

With my current hosting provider the FTP list gets truncated at 9999 files, so it is impossible to run a complete backup!

Unfortunately Drupal does not organise the files directory by itself; it just dumps everything into this single directory. There are modules available which re-organise the files directory, but without having a full backup first this is a risky experiment...

Any help or suggestions are appreciated!

Take a look at rsync - it can be set to copy only files which have changed, so it can be very fast. It can also back up over SSH, so you can run it on a local machine to grab a copy of your server (assuming you have a suitable SSH login there), and only the new/changed files will actually be transferred.
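
For example, pulling the files directory from the server to a local machine (hostname and paths are placeholders):

    # Copy only new/changed files, over ssh, preserving permissions and times.
    rsync -avz -e ssh user@example.com:/var/www/drupal/sites/default/files/ /home/me/backups/drupal-files/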

The contributed Backup and Migrate module is the only way to go as far as I am concerned, although you may have to use maintenance mode for large, busy sites.

Hi,

Can someone explain why it's not sufficient to FTP into my webserver and download all files?

I've just tested this out: I used XAMPP to create a local web server, installed Drupal locally, copied over all my backed-up files, fixed the database name & login details in settings.php, and now I've got a local copy of my site.

What am I missing?

Thanks!
Matthew

I am a new user of Drupal. I have a couple of sites on which I have used WordPress and Drupal. I used to take backups to Amazon AWS with scripting.

One of my friends suggested the following sites:

www.failsafe.us
www.backupmachine.com

Please advise whether they can be used for automated backups, because they both offer pretty good features, and it is very useful to be able to use and modify things if needed. Failsafe also offers an intelligent restore tool that undoes any unauthorized modification made to any of the files.

Please advise whether I should use either of these.

Thanks in advance.

Kriti.T