Performance Issues Plague Cloud-Based Applications

Firefox 3.5 has arrived, and according to Mozilla Foundation developers, its major advantage is speed. The new version of the open source Web browser is the first generally available release to include the TraceMonkey accelerated JavaScript engine, which previously had been found only in the 3.1 betas.

This move is the latest volley in the rejuvenated browser wars, as browser vendors shift their focus toward improving the performance of Web-based applications. Google set the pace when it shipped Chrome with a high-performance JavaScript engine last year. Since then, Opera and Apple have both announced new JavaScript engines for their respective browsers, and even Microsoft has grudgingly worked to optimize IE8.

But browser performance isn't everything. Users experienced delays browsing major news sites in the wake of the death of pop star Michael Jackson last week, but the problem there wasn't slow browsers or even overloaded servers. According to Web monitoring company Keynote Systems, in many cases site slowdowns were caused by ad networks and other third-party content providers, whose own networks couldn't handle the increased traffic.

This incident underscores an issue of growing concern to Web developers. Modern Web apps typically draw from multiple content sources, data stores, and services, and growing interest in cloud computing will only accelerate this trend. But given all these interdependencies, can Web developers really guarantee fast, responsive user experiences? Or as the complexity of our applications continues to grow, is application performance gradually slipping out of our hands? Are we all just throwing ourselves on the mercy of the Internet?

Web Developers' Cloud Conundrum

Making Web pages is easy, but building efficient Web applications can be deceptively tricky. Desktop software is tangible; as a developer, you can get your hands around it. To optimize its performance, you do things like eliminating memory leaks and improving the efficiency of disk access. None of this applies to Web apps, however, where developers rely on browsers to handle local resources efficiently.

Instead, Web developers are confronted with the vagaries of the network. If a user accesses a Web page that draws images from a third-party provider, the overall user experience depends on the user's browser, the user's data connection, the outgoing pipe from the Web server, the Web application software, the pipe between the Web server and the image provider, and the image provider's server software. A Web application developer is in a position to optimize only one of these.

Consider what else developers take for granted in this distributed, cloud-based model. How can you be sure that the third-party image provider takes security seriously? How can you be sure that its systems are sufficiently redundant, and that it makes backups regularly, so you won't be blindsided by any unexpected outage? You can ask, of course.

A more immediate problem lies in the ways in which external services integrate with Web pages. Most of them rely on external JavaScript, iframes, or both. Either of these techniques can block a page's onLoad event, a major factor in the user's perception of site slowness. This bottleneck occurs before the JavaScript code even executes, so the speed of the browser's JavaScript engine makes little difference. Combine this design with an overburdened network, and it's not just the third-party content but your entire application that suffers.
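One common workaround is to inject third-party scripts asynchronously, so a slow ad server delays only its own content rather than the whole page. Below is a minimal sketch of the technique; the function name and the ad-network URL are hypothetical, not part of any particular vendor's API.

```javascript
// Inject a script tag dynamically so it doesn't block HTML parsing
// or the page's onLoad event. "doc" is passed in (normally the
// global "document") to keep the function easy to test.
function loadScriptAsync(doc, src) {
  var script = doc.createElement("script");
  script.src = src;
  script.async = true; // hint to the browser: fetch without blocking
  doc.getElementsByTagName("head")[0].appendChild(script);
  return script;
}

// In a real page you would call something like:
//   loadScriptAsync(document, "https://ads.example.com/tag.js");
```

If the third-party network stalls, the browser can still fire onLoad and render the rest of the page; only the ad slot stays empty.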
