One of the major advantages of moving to the cloud is cost savings, and caching your content is one way to improve those savings even further.
Whenever a server generates dynamic content, it must rebuild that page for every user who visits the site. This can burden the server, slowing performance and possibly affecting reliability. Caching mechanisms help ensure servers run as efficiently as possible.
In the cloud, there are three separate layers at which you can cache: the server, the load balancer, and a Content Delivery Network (CDN).
Two main agents are used to cache content on your server: Varnish and Memcached. Varnish is an HTTP accelerator that helps your site load faster; Memcached, on the other hand, helps more with database-driven sites. Either approach serves your dynamic content to users without having to “pull” that information from scratch every time someone visits your site.
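The pattern behind a Memcached-backed site is usually “cache-aside”: check the cache first, and only hit the database on a miss. Here is a minimal sketch of that idea in Python; a plain dict stands in for the Memcached server, and the function names (`expensive_db_query`, `get_profile_fragment`) are hypothetical placeholders, not part of any real API. In production you would swap the dict for a real client such as pymemcache.

```python
import time

# A plain dict stands in for a Memcached client here; in production you
# would use a real client library pointed at your cache server.
cache = {}

def expensive_db_query(user_id):
    """Placeholder for a slow, database-driven page fragment."""
    time.sleep(0.01)  # simulate query latency
    return f"<p>Profile for user {user_id}</p>"

def get_profile_fragment(user_id, ttl=60):
    """Cache-aside: check the cache first, fall back to the database."""
    key = f"profile:{user_id}"
    now = time.monotonic()
    entry = cache.get(key)
    if entry is not None and entry[1] > now:
        return entry[0]                  # cache hit: skip the database
    html = expensive_db_query(user_id)   # cache miss: do the real work
    cache[key] = (html, now + ttl)       # store with an expiry time
    return html
```

On the second request for the same key, the fragment comes straight from the cache until the TTL expires, so the database is not touched again.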
A second option is to cache at the load balancer layer by storing some of your site’s content there. The benefit is that when someone requests a page, your server doesn’t have to serve that content dynamically; it can be served directly from the load balancer, which takes strain off your server.
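Whether a load balancer (or any shared cache in front of your origin) is allowed to store a response is usually signaled with standard `Cache-Control` headers. The sketch below shows both sides of that handshake in Python; the function names are illustrative, but the header directives (`public`, `private`, `no-store`, `max-age`) are standard HTTP.

```python
def cacheable_headers(max_age=300):
    """Headers the origin attaches so a shared cache (such as a caching
    load balancer) is allowed to store and reuse the response."""
    return {
        "Cache-Control": f"public, max-age={max_age}",
        "Vary": "Accept-Encoding",
    }

def shared_cache_may_store(headers):
    """Simplified check a caching load balancer might apply: store only
    responses explicitly marked cacheable, never private/no-store ones."""
    cc = headers.get("Cache-Control", "")
    directives = {d.strip() for d in cc.split(",")}
    if "no-store" in directives or "private" in directives:
        return False
    return "public" in directives or any(
        d.startswith("max-age=") for d in directives
    )
```

With headers like these on your responses, the load balancer can answer repeat requests itself for `max_age` seconds while the origin server does no work at all.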
The final, and most common, caching layer is a CDN, such as Rackspace Cloud Files. CDNs cache static content on a network of geographically dispersed servers. This means that when your users access the static content on your site, they no longer retrieve that information from your server, but instead get that file from a machine in the CDN that is in their geographic area. This results not only in faster loading times for the end user, but also in less load on your server.
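A common integration step with a CDN is rewriting your static asset URLs so they point at the CDN hostname instead of your origin server. The sketch below does this for HTML in Python; the CDN hostname and path prefixes are hypothetical examples, not real endpoints (with Rackspace Cloud Files, for instance, you would use the CDN URL assigned to your container).

```python
import re

# Hypothetical CDN hostname for illustration only.
CDN_BASE = "https://abc123.cdn.example.com"

# Origin paths considered static; adjust to match your own layout.
STATIC_PREFIXES = ("/static/", "/images/", "/css/", "/js/")

def rewrite_asset_urls(html):
    """Point static asset references at the CDN instead of the origin."""
    def repl(match):
        attr, path = match.group(1), match.group(2)
        if path.startswith(STATIC_PREFIXES):
            return f'{attr}{CDN_BASE}{path}"'
        return match.group(0)  # dynamic paths stay on the origin
    return re.sub(r'((?:src|href)=")(/[^"]*)"', repl, html)
```

Only the matching static paths are rewritten; links to dynamic pages keep pointing at your server, so users pull images, CSS, and JavaScript from a nearby edge node while page HTML still comes from the origin.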