Use Cases Show How Companies Use AWS to Build and Grow

At its New York Summit last week, AWS suggested that companies think of the impressive new services it rolled out as an ever-growing tool set they can use to build and grow.

The public cloud giant hosted several customers, each of which presented a fascinating use case. For those who weren’t able to attend, I’ve rounded up three that offer insight into how companies build and grow with AWS: a massive AWS solution built from scratch, a digital transformation story and a public sector deployment, plus a recap of Rackspace’s session on the Aurora database, AWS’s fastest-growing service ever.

Scaling with AWS

AWS CTO Werner Vogels, sporting a Fortnite t-shirt, introduced Epic Games, maker of Fortnite, which has become a pop culture phenomenon with millions of concurrent players. The company has grown more than a hundredfold in nine months, according to its Director of Platform, Chris Dyl, with its AWS infrastructure scaling alongside its global needs. Fortnite’s dramatic growth over that period meant Epic needed to process two petabytes of data each month – generated by its 125 million players. Dyl said the company is “all in on AWS,” using more than 100 AWS services to meet its diverse needs, and described the advantages of AWS as elasticity, scalability and global access.

Epic Games uses a range of business intelligence services, built around an S3 data lake, to serve its telemetry needs. The company spans 24 AWS Availability Zones around the world, so it needed an analytics ingestion engine capable of processing large volumes of data – 92 million events per minute – through both a real-time pipeline and a batch pipeline.

It also needed to monitor service health from the client’s point of view, understand the user experience and function as an early detection system. Epic uses the resulting analytics to improve business outcomes: running live tournaments, setting up custom rules and custom game environments, driving KPIs, calculating ARPU and analyzing telemetry and gameplay data to identify areas of the game design to improve. Together, these AWS tools form an analytics pipeline that manages this massive ingestion of data.
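As a rough illustration of what the real-time half of such a pipeline can look like – not Epic’s actual code – here is a minimal sketch of batching telemetry events into Kinesis PutRecords calls. The stream name, event shape and `player_id` partition key are my assumptions:

```python
# Sketch: batching telemetry events for Kinesis. The service limit of 500
# records per PutRecords call is real; everything else is illustrative.
import json
from itertools import islice

KINESIS_BATCH_LIMIT = 500  # PutRecords accepts at most 500 records per call


def batch(events, size=KINESIS_BATCH_LIMIT):
    """Yield successive lists of at most `size` events."""
    it = iter(events)
    while chunk := list(islice(it, size)):
        yield chunk


def publish(events, stream="telemetry", region="us-east-1"):
    """Push telemetry events to a Kinesis stream in PutRecords-sized batches."""
    import boto3  # imported lazily so `batch` is usable without the AWS SDK

    kinesis = boto3.client("kinesis", region_name=region)
    for chunk in batch(events):
        kinesis.put_records(
            StreamName=stream,
            Records=[
                {"Data": json.dumps(e), "PartitionKey": str(e["player_id"])}
                for e in chunk
            ],
        )
```

A batch-pipeline counterpart would typically land the same events in the S3 data lake for later analysis.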

Dyl asked the audience to join him on this journey with, “See you on the battle bus” – a line I’m sure many of you reading this recognize, or at least your kids probably do!

The power of digital transformation

The digital transformation of 21st Century Fox was described by its Chief Technology Officer Paul Cheesbrough, who discussed how the global media company uses AWS to enable large-scale business transformation. Using more than 100 AWS services, 21st Century Fox is driving innovation across its supply chain, data platforms and consumer product experiences. The company has reduced its datacenter footprint from 74 facilities down to four since deploying to AWS, improved its security posture and saved 25 percent on its IT spend — all while improving speed to market.

21st Century Fox uses Amazon Athena and Amazon Rekognition, along with other services, in innovative ways to deliver content globally, replacing the time-consuming approach of physically shipping media content around the world.
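To give a sense of how Rekognition fits a media workflow like this, here is a hedged sketch of auto-tagging an image stored in S3. The bucket, key and confidence threshold are illustrative; Fox’s actual pipeline was not detailed in the session:

```python
# Sketch: auto-tagging media with Amazon Rekognition's detect_labels API.
# The threshold and S3 locations are assumptions, not Fox's real settings.


def confident_labels(response, threshold=80.0):
    """Keep only label names Rekognition reports at or above `threshold`%."""
    return sorted(
        {l["Name"] for l in response.get("Labels", []) if l["Confidence"] >= threshold}
    )


def tag_image(bucket, key, threshold=80.0):
    """Run label detection on an S3 object and return the confident labels."""
    import boto3  # imported lazily so confident_labels works without the SDK

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=threshold,
    )
    return confident_labels(response, threshold)
```

Tags like these can then be indexed and queried with Athena instead of being catalogued by hand.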

Dr. Matt Wood, general manager of AWS Machine Learning Services, reviewed initiatives in the heavily emphasized machine learning space. AWS’s goal, he said, is to put machine learning in the hands of every developer, and he noted that eight out of 10 machine learning workloads now run on AWS.

A local tale, with a public twist

The third customer story I want to share is about the Depository Trust and Clearing Corporation, or DTCC, a post-trade financial services company providing clearing and settlement services to the financial markets. DTCC processed more than $1,610 trillion in securities transactions in the past year.

DTCC’s leaders wanted to move to the cloud, but had to meet significant regulatory and compliance requirements to do so. The company needed to create a trade warehouse for derivatives and provide transparency and public access to data related to those transactions. Its system handles a huge volume of data while allowing end-user access, so the challenge included balancing system performance and cost, maintaining resiliency across multiple Availability Zones and meeting significant global compliance requirements.

DTCC needed to answer four main questions with its solution:

  • Can we handle the unknown scale of public demand?
  • How do we manage costs?
  • How fast can we deploy?
  • Can we use the public cloud for this?

DTCC turned to AWS, and the resulting solution delivered significant cost savings, search capabilities far beyond its legacy tape archives and full compliance with regulatory requirements. To further unlock its data and improve exception processing, DTCC adopted Lambda, Redshift and Aurora (among other services) to solve many of its data ingestion and analytics challenges – all while maintaining the standards of resiliency expected of the financial industry.
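As an illustration of the serverless ingestion pattern described above – not DTCC’s actual code – here is a minimal Lambda-style handler that separates loadable trade records from exceptions. The field names are assumptions:

```python
# Sketch: a Lambda handler that validates incoming trade records before
# they are loaded downstream. Required fields are hypothetical examples.
import json

REQUIRED_FIELDS = {"trade_id", "asset_class", "notional", "execution_ts"}


def validate(record):
    """Return the sorted list of required fields missing from a trade record."""
    return sorted(REQUIRED_FIELDS - record.keys())


def handler(event, context):
    """Lambda entry point: split a batch into loadable rows and exceptions."""
    rows, exceptions = [], []
    for raw in event.get("Records", []):
        record = json.loads(raw["body"])
        missing = validate(record)
        (exceptions if missing else rows).append(record)
    # In a real deployment the rows would be written to Aurora or Redshift
    # here, and the exceptions routed to an exception-processing queue.
    return {"loaded": len(rows), "exceptions": len(exceptions)}
```

The exception records are exactly what an improved exception-processing workflow would then pick up and resolve.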

The evolution of Aurora

Vogels returned to the stage to describe AWS’ efforts to build cloud-native databases, particularly Aurora – its fastest-growing service ever. Aurora was designed to exploit modern cloud capabilities: it replicates across at least three Availability Zones, is highly durable and, of course, carries a different licensing model and cost structure than traditional databases that did not originate in the cloud.

The series of enhancements made to Aurora in the last six months to meet customer needs is impressive, and includes:

  • Amazon Aurora Backtrack – Lets users rewind a database cluster to a prior point in time without restoring from a backup.
  • Amazon Aurora Multi-Master – Improves availability and write performance by introducing multi-master replication.
  • Amazon Aurora Serverless – Reduces the difficulty of managing unpredictable workloads through on-demand autoscaling, with no need to provision instances.
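To make the serverless option concrete, here is a hedged sketch of provisioning an Aurora Serverless cluster with boto3; the cluster identifier and capacity bounds are illustrative, not recommendations:

```python
# Sketch: creating an Aurora Serverless cluster. Capacity is expressed in
# Aurora Capacity Units (ACUs); the values below are example choices only.


def scaling_config(min_acu=2, max_acu=16, pause_after=300):
    """Build the ScalingConfiguration block for an Aurora Serverless cluster."""
    return {
        "MinCapacity": min_acu,   # smallest ACU count to scale down to
        "MaxCapacity": max_acu,   # largest ACU count to scale up to under load
        "AutoPause": True,        # pause compute entirely when idle...
        "SecondsUntilAutoPause": pause_after,  # ...after this many idle seconds
    }


def create_serverless_cluster(identifier="demo-serverless"):
    """Provision the cluster; no instances need to be created separately."""
    import boto3  # imported lazily so scaling_config works without the SDK

    rds = boto3.client("rds")
    return rds.create_db_cluster(
        DBClusterIdentifier=identifier,
        Engine="aurora-mysql",
        EngineMode="serverless",
        ScalingConfiguration=scaling_config(),
    )
```

With AutoPause enabled, an idle cluster costs only storage until the next connection wakes it up.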

Customers have migrated more than 80,000 databases to AWS using the AWS Database Migration Service.

How Rackspace enhances Aurora

Rackspace, an AWS Premier Consulting Partner and an Amazon Aurora validated partner, hosted an extremely well-attended session at the summit on understanding high availability on Aurora.

We have been focusing on Aurora, as an ever-growing number of our customers are utilizing this database with our end-to-end services supporting those deployments. Read more about Rackspace End-to-End Management of Aurora and how we help clients architect and deploy this service to maximize its potential.

The standing-room-only crowd heard from Rackspace AWS technical evangelist Eric Johnson on launching and configuring an Aurora cluster to ensure that high availability and performance requirements are met. He also covered how to extend replication patterns, how to choose the right endpoints to optimize writes and reads, and the future of Aurora – serverless.
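The endpoint-selection advice can be sketched as follows. The hostnames are hypothetical, but the cluster (writer) endpoint and the `-ro-` reader endpoint, which load-balances across replicas, are the two endpoints an Aurora cluster exposes:

```python
# Sketch: route reads to the reader endpoint and writes to the cluster
# (writer) endpoint. Hostnames below are made up for illustration.

WRITER_ENDPOINT = "demo.cluster-abc123.us-east-1.rds.amazonaws.com"
READER_ENDPOINT = "demo.cluster-ro-abc123.us-east-1.rds.amazonaws.com"


def endpoint_for(statement):
    """Route SELECT statements to the reader endpoint, all else to the writer."""
    is_read = statement.lstrip().lower().startswith("select")
    return READER_ENDPOINT if is_read else WRITER_ENDPOINT
```

Keeping reads off the writer this way is one of the simplest levers for meeting both availability and performance requirements.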

David Lucky is a product marketing expert on the Public Cloud marketing team at Rackspace, focusing on Fanatical Support for AWS and for the Alibaba Cloud. He works with engineering and sales teams to identify and prioritize customer requirements to migrate and securely manage their cloud deployments. David came to Rackspace from Datapipe, where, as director of product management for six years, he led product development in building services to help enterprise clients leverage managed IT services to solve complex business challenges. David has unique insight into the latest product developments for private, public and hybrid cloud platforms and a keen understanding of industry trends and their impact on business development. He holds an engineering degree from Lehigh University and is based out of Jersey City, NJ. You can follow David on LinkedIn at linkedin.com/in/davidlucky and Twitter @Luckys_Blog.
