
Bandwidth Bottlenecks Loom Large in the Cloud

Optimizing the Network for Data Backup

John Lax, vice president of information systems for Washington, D.C.-based International Justice Mission (IJM), credits WAN optimization controllers (WOCs) for enabling the bandwidth-challenged global nonprofit's move to the cloud.

IJM, a human rights agency that rescues children from sex trafficking and slavery, has 500 employees and 14 field offices in 10 countries. Lax says many employees contend with a triple challenge: very low bandwidth (e.g., 512Kbps), fragile connections that frequently drop, and steep fees (a 256Kbps link in Uganda costs $1,200 per month).

Introducing the cloud to remote areas required a carefully constructed plan that took these issues into account. The organization wanted to maximize how long a link stays active without interruption, he explains.

Lax decided the best use of the cloud for the farthest-flung workers would be for backups. "We no longer wanted manual intervention of changing and tracking tapes," he says. Each field office has installed Riverbed's Whitewater cloud storage appliance, which connects to another Whitewater appliance in IJM's Richmond, Va., data center.

Data, such as case workers' sensitive documentation about children, is encrypted, de-duplicated and compressed to speed transfers. The data center's Whitewater appliance is also used with a Whitewater virtual appliance to back up and archive data on Amazon's S3 Cloud Service.
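
Conceptually, that pipeline can be sketched in a few lines of Java. The example below is a minimal, hypothetical sketch, not Riverbed's implementation: it assumes fixed-size chunks, SHA-256 fingerprints for de-duplication, GZIP compression and AES encryption, with key handling omitted for brevity.

    import java.io.ByteArrayOutputStream;
    import java.security.MessageDigest;
    import java.util.*;
    import java.util.zip.GZIPOutputStream;
    import javax.crypto.Cipher;
    import javax.crypto.SecretKey;

    public class BackupPipeline {
        private static final int CHUNK_SIZE = 4096;                   // fixed-size chunks (assumption)
        private final Set<String> seenFingerprints = new HashSet<>(); // in-memory dedup index
        private final SecretKey key;

        public BackupPipeline(SecretKey key) { this.key = key; }

        // Returns only the chunks not seen before, each compressed then encrypted.
        public List<byte[]> prepare(byte[] data) throws Exception {
            List<byte[]> toSend = new ArrayList<>();
            for (int off = 0; off < data.length; off += CHUNK_SIZE) {
                byte[] chunk = Arrays.copyOfRange(data, off,
                        Math.min(off + CHUNK_SIZE, data.length));
                if (seenFingerprints.add(fingerprint(chunk))) {       // dedup miss: new chunk
                    toSend.add(encrypt(compress(chunk)));
                }                                                     // duplicates are skipped entirely
            }
            return toSend;
        }

        private static String fingerprint(byte[] chunk) throws Exception {
            return Base64.getEncoder().encodeToString(
                    MessageDigest.getInstance("SHA-256").digest(chunk));
        }

        private static byte[] compress(byte[] chunk) throws Exception {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(out)) { gz.write(chunk); }
            return out.toByteArray();
        }

        private byte[] encrypt(byte[] chunk) throws Exception {
            // Defaults to AES/ECB here for brevity; a real system would use an
            // authenticated mode such as AES/GCM with per-chunk IVs.
            Cipher cipher = Cipher.getInstance("AES");
            cipher.init(Cipher.ENCRYPT_MODE, key);
            return cipher.doFinal(chunk);
        }
    }

A real appliance would chunk on content boundaries and persist the fingerprint index, but the point is the same: only new chunks, compressed and encrypted, ever cross the slow link.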

Lax says the appliances have produced a six-fold reduction in traffic, cutting bandwidth costs and making backup windows shorter and more predictable. And if users accidentally delete a directory, they can retrieve it from the appliance's built-in buffer in 12 seconds, versus the 36 hours previously needed to recover from tape. In total, IJM has backed up 5.5 terabytes of data to the cloud, helping ensure the security and integrity of the group's work.

Syncing Across Data Centers

While optimization appliances can go a long way toward combating bandwidth bottlenecks, IHG's Conophy took a different tack. Like Lax, Conophy has had to architect his cloud network to support users in the far reaches of the globe. The company has three primary data centers, in Georgia, Virginia and California, with secondary data centers in Dubai, Shanghai, Singapore and Sydney. Conophy says they are strategically situated near users to keep the experience fast and responsive.

Although keeping data completely synchronized across all data centers would be impossible without a major investment, Conophy wanted to get close. Guests connecting through a variety of channels, including smartphones, tablets and websites, are expected to conduct 50 billion transactions annually within the next decade. "Our guests connect to us via multiple channels and devices, and our challenge is to maintain data synchronization of their reservations and guest profiles while growing to meet the transaction challenge," Conophy explains.

Using the Terracotta Enterprise Suite, IHG quickly and efficiently keeps its Java Virtual Machines in sync, with caches distributed across data centers. "It's basically a repository that lets us do data shifting from a primary database across multiple nodes," he explains. The result, he says, is data access 50 to 100 times faster than traditional methods, along with solid indexing and data integrity from one data center to the next.
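
At the time, Terracotta clustering was typically used through the Ehcache 2.x API, so a read from such a cache looks like the cache-aside sketch below. This is a hypothetical illustration of the pattern, not IHG's code: the cache name "guestProfiles" and the loadProfileFromDatabase helper are invented, and the actual cross-JVM distribution is configured in ehcache.xml (via a <terracotta/> element) rather than in Java.

    import net.sf.ehcache.Cache;
    import net.sf.ehcache.CacheManager;
    import net.sf.ehcache.Element;

    public class GuestProfileStore {
        // The cache named here would be declared in ehcache.xml; marking it
        // with <terracotta/> is what distributes it across clustered JVMs.
        private final Cache cache = CacheManager.getInstance().getCache("guestProfiles");

        public Object getProfile(String guestId) {
            Element hit = cache.get(guestId);       // fast path: served from the distributed cache
            if (hit != null) {
                return hit.getObjectValue();
            }
            Object profile = loadProfileFromDatabase(guestId); // slow path: primary database
            cache.put(new Element(guestId, profile));          // now visible to every clustered node
            return profile;
        }

        private Object loadProfileFromDatabase(String guestId) {
            // Hypothetical stand-in for a JDBC or ORM lookup against the primary database.
            return new Object();
        }
    }

The design point is that the primary database is hit once per key; every subsequent read, from any data center in the cluster, is answered from the shared cache.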

Looking Within

Sometimes, Conophy says, "You create your own data storm." This can happen if companies put an application in the cloud that has to frequently access an internal database. The back-and-forth can quickly overburden pipes and cause performance problems.

To avoid this, Enterprise Management Associates' Frey recommends using tools to map application interdependencies and devising cloud strategies to accommodate them. "Get some measure of what applications are drawing off each other and then you can move them closer together vs. taking a hit on latency," he says.
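
One rough way to get that measure is to time the chatter between an application and the services it calls. The sketch below, a hypothetical probe using only JDK classes (the endpoint URL is invented), averages the round-trip time to a dependency; multiplied by the number of calls a typical transaction makes, it gives a feel for what moving the two tiers apart would cost.

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class DependencyProbe {
        public static void main(String[] args) throws Exception {
            // Hypothetical dependency endpoint; substitute the database gateway or
            // service your cloud-hosted application would be calling back into.
            URL dependency = new URL("http://internal-db-gateway.example.com/health");
            int samples = 50;
            long totalNanos = 0;

            for (int i = 0; i < samples; i++) {
                long start = System.nanoTime();
                HttpURLConnection conn = (HttpURLConnection) dependency.openConnection();
                conn.getResponseCode();            // one full request/response round trip
                conn.disconnect();
                totalNanos += System.nanoTime() - start;
            }

            double avgMillis = totalNanos / samples / 1_000_000.0;
            // If each user transaction makes N such calls, moving the app to the cloud
            // adds roughly N * (new round trip - avgMillis) of latency per transaction.
            System.out.printf("average round trip: %.2f ms over %d samples%n",
                    avgMillis, samples);
        }
    }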

Much as on an internal network, sometimes more bandwidth is the only cure for congestion. If you're suddenly pushing all of your users out to cloud-based services such as Google Apps, you're going to need fatter pipes from your headquarters and remote offices. That reality has to be weighed when deciding whether to head to the cloud.

Although bandwidth has mostly taken a backseat to other cloud-related considerations, analyst Lanowitz says now is the time to bring it to the fore. "The risk for failure is growing because the company brand is now inextricably linked to the technology running," she says. Even so, companies can't hand over bandwidth quality control to external providers; it's something, she says, that must remain in-house.
