Filed by Robert Bates | July 20, 2011 2:48 pm
This is a guest post by Rackspace customer Robert Bates, a Senior Developer for Cool Blue Interactive in Atlanta, GA.
Drupal has irrefutably become one of the major players in the content management system (CMS) market, and has one of the largest installed bases of any open source CMS. As Drupal has steadily moved from hobbyist status to enterprise solution, the data that powers these sites has become more critical than ever before. Many hosting solutions offer automated backups that snapshot your server at a fixed point in time. However, if you want a quick backup of only your database that is safely obtained through database calls, in the past you either had to build your own solution or dump the database manually. Thankfully there is a great Drupal module available that makes this painless to set up and administer.
Community-contributed modules are the building blocks that let you customize a Drupal site and extend the functionality far beyond simply editing basic pages and navigation systems. Poke any Drupal developer and ask for their “base install” and you will get a laundry list of favorite, must-have modules. I’m as guilty as the next code slinger, and have my own list of favorites. One is the excellent Backup and Migrate module for Drupal, written by Ronan Dowling (@ronan4000).
The Backup and Migrate module gives site administrators an easy-to-use, flexible interface for defining destinations, profiles, and schedules for automated backups. It also provides an on-site restoration option, assuming your site can still limp along; if not, the backups are all SQL-based, so the normal command-line tools still work. At the time of this writing, the module ships with several backup destinations already defined:
• Server Directory
• MySQL Database (a form of replication)
• FTP Directory
• Amazon S3 Bucket
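Because the backups are plain SQL dumps, restoring one from the command line is straightforward even when the site itself is down. A minimal sketch, assuming a gzipped dump and standard MySQL client tools (the filename, user, and database name below are placeholders):

```shell
# Restore a gzipped SQL dump with the standard MySQL client.
# Filename, user, and database name are placeholders -- substitute your own.
gunzip -c mysite-backup.sql.gz | mysql -u dbuser -p drupal_db
```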
Due to bandwidth costs, many administrators use only the Server Directory destination and rely on the hosting company’s backup services for full filesystem recovery. While this sounds great in theory, at some point a site’s backups need to be shipped offsite, physically or digitally, as insurance against force majeure. Cloud storage is one of the fastest and easiest ways to get your Drupal site’s database backups safely offsite, and now there is a way to leverage the Cloud Files service you should already have available with your Cloud Servers and Cloud Sites services. You did sign up for it, didn’t you…?
If you are not familiar with the Rackspace Cloud Files service, you need to be. You can create private archives as well as publish files via Rackspace’s CDN using a public URL. Your Cloud Sites and Cloud Servers use it to automatically back up server images. And did I mention the transfer speeds between a Cloud Server and Cloud Files? Simply blazing. The storage costs are very reasonable as well, considering the bandwidth and service level agreement (SLA) offered by Rackspace.
In order to take advantage of the Cloud Files service, I wound up developing a companion module for Backup and Migrate, the Backup and Migrate Rackspace Cloud Files module. It adds a new destination option to the Backup and Migrate module’s lineup for Rackspace Cloud Files, and setup only requires four pieces of information:
• Destination name. This is used locally to identify this destination when setting up backup jobs.
• Cloud Files Container. This specifies the “directory” to group the backups under in your Cloud Files account.
• Username. This is your login for your Cloud Files account.
• API Key. This is requested in lieu of your password, and is available from the Control Panel.
Once you have your new backup destination defined, you can use it anywhere in the Backup and Migrate module that requires a destination. Since the module leverages the PHP binding generously provided by Rackspace under the MIT license, the module takes advantage of almost all the features available to Cloud Files developers. In tests on a standard Linux-based Cloud Server with 2GB of memory, backups 15MB in size typically complete in under 20 seconds – which is fast considering that time includes the database dump, compression, and transfer to the Cloud Files network.
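The dump–compress–transfer pipeline described above can be sketched in miniature. The sketch below is illustrative Python, not the module’s actual PHP code: the object-naming pattern is hypothetical, and the commented-out upload step only shows the rough shape of the old python-cloudfiles calls, not the PHP bindings the module actually uses.

```python
import gzip
import io
from datetime import datetime

def backup_object_name(site, when):
    """Build a timestamped object name (hypothetical pattern; check the
    module's settings for its real file-naming format)."""
    return "%s-%s.sql.gz" % (site, when.strftime("%Y-%m-%dT%H-%M-%S"))

def compress_dump(sql_text):
    """Gzip a SQL dump in memory before shipping it offsite."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        gz.write(sql_text.encode("utf-8"))
    return buf.getvalue()

# Uploading the compressed dump to Cloud Files would then look roughly
# like this with Rackspace's python-cloudfiles binding (not executed here):
#   import cloudfiles
#   conn = cloudfiles.get_connection(USERNAME, API_KEY)
#   container = conn.create_container("drupal-backups")
#   obj = container.create_object(backup_object_name("mysite", datetime.now()))
#   obj.write(compress_dump(sql_text))

name = backup_object_name("mysite", datetime(2011, 7, 20, 14, 48))
# -> mysite-2011-07-20T14-48-00.sql.gz
```

The same three steps — dump, compress, upload — are what account for the sub-20-second timings mentioned above.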
The short and sweet list – if you are running Drupal sites on Rackspace servers:
• Install the Backup and Migrate module
• Install the Backup and Migrate Rackspace Cloud Files module
• Make sure you have the Rackspace Cloud Files service set up on your account
• Configure your backups, check them over the first week, and if all is going as planned sit back and relax knowing your site’s database is in safe hands
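For those comfortable with Drush, the first two items on that list can be condensed into a couple of commands. A sketch, assuming Drush is installed and that the companion module’s drupal.org project name is `backup_migrate_cloudfiles` — verify the actual machine name on the project page before running:

```shell
# Download and enable both modules with Drush.
# The second project name is assumed; check drupal.org for the real one.
drush dl backup_migrate backup_migrate_cloudfiles
drush en -y backup_migrate backup_migrate_cloudfiles
```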
By all means, if you have suggestions, questions, or concerns about the Backup and Migrate Rackspace Cloud Files module, please feel free to file an issue in the module’s queue and I will address it as soon as possible.
Source URL: http://blog.rackspace.com/streamlining-drupal-backups-using-rackspace-cloud-files/
Copyright ©2015 The Official Rackspace Blog unless otherwise noted.