We’ve been revisiting the concept of data gravity lately and how it affects where applications are placed. As a quick refresher, the idea is fairly simple: as organizations migrate infrastructure to the cloud, data that remains outside the cloud starts to gravitate toward the applications running there. Pulling data closer to that infrastructure can reduce latency and improve efficiency, speed, and application performance, all of which can positively impact the end user’s experience.
In multi-data center designs, data center managers place workloads closest to the data that is commonly accessed, minimizing the impact of latency. An application hosted in the cloud has the same considerations. The simplest technical solution is to host workloads requiring cloud-based data in the same cloud service.
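A back-of-envelope calculation shows why placement matters: moving a large dataset to a distant workload is slow even on a fast link. The sketch below (illustrative numbers of our own choosing, not figures from the article) computes the ideal transfer time for a given data size and bandwidth, ignoring protocol overhead and contention.

```python
def transfer_hours(size_tb: float, bandwidth_gbps: float) -> float:
    """Ideal transfer time in hours for size_tb terabytes over a
    bandwidth_gbps link, ignoring protocol overhead and contention."""
    bits = size_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits / (bandwidth_gbps * 1e9)
    return seconds / 3600

# Moving 1 TB over a dedicated 1 Gbps link takes over two hours at best:
print(f"{transfer_hours(1, 1):.1f} h")  # -> 2.2 h
```

Repeating that transfer for every analytics run is what makes hosting the workload next to the data the simplest answer.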
Another simple solution is to co-locate your non-cloud workloads in a cloud exchange. Switch’s Cloud Exchange is a value-add Switch offers to its cloud provider and enterprise customers hosting equipment in its data centers. Switch can run cross connects from customer equipment directly to cloud providers; that proximity eliminates the need for dedicated circuits between a cloud provider and a customer.
Another option is to purchase on-premises cloud services. Since the data is local to the customer’s data center, data gravity doesn’t factor into application performance.
Another tip: some cloud providers are evasive about disclosing the locations of their data centers, which makes latency harder to predict. Wired notes that to really understand latency, you should know the answers to the following questions:
- Are your VMs stored on different SANs or running on different hypervisors, for example?
- Do you have any say in decisions that will impact your own latency?
- How many router hops are in your cloud provider’s internal network, and what bandwidth does their own infrastructure provide?
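One way to ground those questions is to measure latency yourself from your own facility. Below is a minimal sketch, using placeholder hostnames (the `.invalid` endpoints are stand-ins; substitute your provider’s actual endpoints), that times TCP connections and reports a median connect-time estimate per endpoint.

```python
import socket
import time
from typing import Optional

# Placeholder endpoints -- substitute your cloud provider's real hostnames.
ENDPOINTS = [
    ("us-east.example.invalid", 443),
    ("eu-west.example.invalid", 443),
]

def tcp_connect_latency(host: str, port: int, samples: int = 5) -> Optional[float]:
    """Return the median TCP connect time in milliseconds, or None if
    the host is unreachable or unresolvable."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=3):
                times.append((time.perf_counter() - start) * 1000)
        except OSError:
            return None
    times.sort()
    return times[len(times) // 2]  # median of the sampled connect times

if __name__ == "__main__":
    for host, port in ENDPOINTS:
        latency = tcp_connect_latency(host, port)
        label = f"{latency:.1f} ms" if latency is not None else "unreachable"
        print(f"{host}: {label}")
```

A TCP connect time is a rough proxy for round-trip latency; for hop counts and per-hop delay you would still need a traceroute from your site to the provider’s edge.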
The idea of transforming one’s operations through the use and placement of data is a revolutionary one. And while some processes and workloads are physically restricted from moving to the cloud, with the likes of AWS, Google, and Microsoft continuing to refine what it means to physically store data, it makes more sense for enterprises to move ever more workloads and applications to the cloud.
For other ways to reduce latency, here are some favorite insights: