There are five essential pillars of cloudiness. In this recurring blog series, we’ve counted down from No. 5 to No. 1. Last time, we looked at modular design. In this fifth and final post, we discuss pillar No. 1: parallel computing.
Parallel is a term straight from the high school nightmare: geometry class. Classically, it means two lines that are always equidistant from one another and therefore never cross. When applied to computing, these lines are usually metaphors for processes or units of computational work. Parallel computing takes advantage of every trick in the book to get more work done in less time. Fred Brooks (of The Mythical Man-Month fame) would rightly object, but throwing more people at a manufacturing problem obviously decreases the time it takes to assemble the final product, right?
Applications, especially web applications, have historically been built with synchronization baked in. This means that every action requested of the web application is lock-stepped through the entire process. For example, when submitting a request to share text on a pastebin service, the web front-end taking the pasted text needs to get a response from the database, the text parser and the URL shortener. Each one of these seemingly non-overlapping processes needs to complete before we, as users of the service, get a response.
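To make the difference concrete, here is a minimal sketch of that pastebin request. The three back-end functions are hypothetical stand-ins (not a real pastebin API), with sleeps simulating network latency; the point is that the lock-stepped version pays each latency in sequence, while the parallel version overlaps them:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the pastebin's back-end steps;
# each sleep simulates network or processing latency.
def store_in_database(text):
    time.sleep(0.1)
    return "paste-id"

def parse_text(text):
    time.sleep(0.1)
    return text.strip()

def shorten_url(text):
    time.sleep(0.1)
    return "https://short.example/abc"

text = "hello, world"

# Lock-stepped: each step waits for the previous one (~0.3 s total).
start = time.monotonic()
sequential = [store_in_database(text), parse_text(text), shorten_url(text)]
sequential_time = time.monotonic() - start

# Parallel: the non-overlapping steps run at the same time (~0.1 s total).
start = time.monotonic()
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(f, text) for f in
               (store_in_database, parse_text, shorten_url)]
    parallel = [f.result() for f in futures]
parallel_time = time.monotonic() - start
```

The same results come back either way; only the wall-clock time changes.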
If it feels like this design goes against some of the other pillars we’ve already discussed (e.g. modular design, agility), excellent! By applying the other pillars, we’ve started to achieve parallelization already. By moving work into non-overlapping sections, we can do more work at the same time rather than having some work wait on other work.
Let’s quickly visualize a more accessible example. Bucket brigades are used to move large numbers of sandbags to build dikes in the event of a flood. The process works effectively because roughly every other person has a bag in hand at any given moment; the number of bags in transit is about half the number of people in the brigade. If we only allowed one sandbag through the brigade at a time, the process would obviously be much slower (have lower throughput), but it would be more indicative of an application designed with synchronization. This is also an excellent example of horizontal scaling to go with the modular design we discussed in the previous post.
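The throughput gap in the brigade can be put in rough numbers. The model below is a deliberate simplification (one hand-off per time step, ignoring the half-spacing between bags), but it shows why keeping the whole line busy matters:

```python
# A simplified model of the bucket brigade: `people` form a pipeline
# and each hand-off takes one time step.
def serial_steps(bags, people):
    # One bag at a time: each bag traverses the whole line alone.
    return bags * people

def pipelined_steps(bags, people):
    # Bags follow each other down the line: the first bag takes
    # `people` steps, and every later bag finishes one step after
    # the previous one.
    return people + (bags - 1)

print(serial_steps(100, 10))     # 1000 steps
print(pipelined_steps(100, 10))  # 109 steps
```

With 10 people and 100 bags, the pipelined brigade is nearly ten times faster, and the advantage grows with the number of bags.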
Achieving parallelism in an application has been studied for decades and is still being advanced all the time. Parallel computing requires communication of data between compute processes. Several communication patterns exist, but common ones used in cloud applications are one-to-one, one-to-any and one-to-all, all of which can easily be implemented using message queues (e.g. RabbitMQ, IronMQ, etc.).
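The one-to-any pattern is the workhorse of these: one producer enqueues tasks and whichever worker is free picks up the next one. Here is a sketch using Python’s standard-library queue and threads as a stand-in for a broker like RabbitMQ or IronMQ (a real deployment would put the queue on the network, but the pattern is the same):

```python
import queue
import threading

# One-to-any (competing consumers): one producer, a pool of workers,
# and a shared queue; any free worker takes the next task.
tasks = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        item = tasks.get()
        if item is None:          # sentinel: no more work
            break
        results.put(item * item)  # pretend this is expensive work

workers = [threading.Thread(target=worker) for _ in range(4)]
for w in workers:
    w.start()

for n in range(10):               # the producer enqueues ten tasks
    tasks.put(n)
for _ in workers:                 # one sentinel per worker to shut down
    tasks.put(None)
for w in workers:
    w.join()

squares = sorted(results.queue)   # order of completion is nondeterministic
print(squares)
```

One-to-all (broadcast) differs only in that the producer puts a copy of each message on every worker’s own queue instead of a single shared one.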
Parallelism buys us an incredible boon: the ability to service more requests, or perform more computation, in less time. That’s why we saved it for last.
This concludes our tour through the five essential pillars of cloudiness: guidelines for designing applications for the cloud. It’s been a fun ride. Be sure to check out the other pillars and head over to the Rackspace DevOps Blog for deeper technical dives into each one. Now get out there and start building great apps on the cloud.
Alex Brandt, a Technical Trainer with Rackspace University, contributed to this article.
Here’s a recap:
The Pillars Of Cloudiness:
- Pillars Of Cloudiness: No. 5 – Security
- Pillars Of Cloudiness: No. 4 – Agility Returns
- Pillars Of Cloudiness: No. 3 – Scaling Horizontally
- Pillars Of Cloudiness: No. 2 – Modular Design
Technical Deep Dives: