Cyberattack Lessons
The attack apparently used malware to gain control of 50,000 PCs, combining the systems into a virtual army of sorts. At its peak, the denial-of-service attack pounded sites' servers with as much as 20 to 40 gigabytes of data per second, a full 10 times the amount of traffic such servers typically handle.
Even given the massive scope of the attack, though, many observers are left wondering how a government Web site could have been unprepared. After all, federal spokespeople have gone on the record as saying these denial-of-service attacks are attempted on a daily basis year-round. The White House and the Department of Homeland Security, in fact, were among the sites attacked in the recent incident. So why did they remain unaffected, while others buckled under the pressure?
The simple truth is that those sites are likely common targets, and their operators are consequently well-versed in handling even the most extreme attempts at server overload. More specifically, however, researchers believe one simple piece of missing knowledge may have made the difference between those sites' resilience and the other sites' collapses.
“Too many federal agency security people did not know which network service provider connected their Web sites to the Internet,” explains Alan Paller, director of research at the SANS Institute, a security research organization.
As a result, Paller says, the agencies were unable to reach their providers and have them filter out the bad traffic — a tactic that could have kept the servers from buckling under the tremendous pressure.
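To put that tactic in concrete terms, provider-side filtering can be as simple as dropping traffic from any source that exceeds a request threshold within a short window. The Python sketch below is purely illustrative; the threshold, window length, and function names are hypothetical, not anything the agencies or their providers actually use.

```python
import time
from collections import defaultdict, deque

# Illustrative sketch only: a crude provider-side filter that drops traffic
# from any source exceeding a request threshold within a sliding window.
# The threshold and window values are hypothetical, not real provider policy.
RATE_LIMIT = 100        # max requests allowed per source in the window
WINDOW_SECONDS = 10     # length of the sliding window

recent_requests = defaultdict(deque)   # source IP -> timestamps of recent requests

def should_drop(source_ip: str) -> bool:
    """Return True if traffic from this source should be filtered out."""
    now = time.time()
    timestamps = recent_requests[source_ip]

    # Forget requests that have fallen outside the window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()

    timestamps.append(now)
    return len(timestamps) > RATE_LIMIT
```

Paller's point, though, is that a filter like this only helps if an agency knows which provider to ask to deploy it. And as his later comments make clear, once flood traffic starts to resemble legitimate requests, simple per-source thresholds lose much of their bite.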
Challenges and Solutions
Of course, other factors were at play as well: The zombie computers used in the attack were located all over the world, the SANS Institute found, including within the United States. What's more, the set of active systems shifted from moment to moment, making them even more difficult to identify.
“The attacks have become increasingly sophisticated since the end of last week,” Paller says. “It started as a flood that was easy for network service providers to filter, and then went through at least two increases in sophistication so that the flood look[ed] more and more like legitimate traffic.”
Still, the fact that certain Web sites were able to withstand the pressure suggests that others could have done the same. Paller and his team believe federal security officials will now move to set up a private database of government Web sites and their network providers. That way, should a similar attack happen again, the sites’ administrators could act quickly to block as much of the malicious traffic as possible — before any servers are knocked offline.
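Such a database would not need to be elaborate. The sketch below is a hypothetical illustration of the kind of record it might hold: a site, its upstream provider, and an emergency contact to call when filtering is needed. The names and addresses are placeholders, not real agencies or providers.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the kind of record such a database might hold.
# Every value below is a placeholder, not a real agency, provider, or contact.
@dataclass
class ProviderRecord:
    site: str                # government Web site
    provider: str            # network service provider connecting it to the Internet
    emergency_contact: str   # around-the-clock contact for filtering requests

registry = {
    "example-agency.gov": ProviderRecord(
        site="example-agency.gov",
        provider="Example Carrier",
        emergency_contact="noc@example-carrier.example",
    ),
}

def lookup_provider(site: str) -> Optional[ProviderRecord]:
    """Find whom to call when a given site comes under attack."""
    return registry.get(site)
```

The lookup itself is trivial; keeping the mapping private, accurate, and current across agencies would be the hard part.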
Connect with JR Raphael on Twitter (@jr_raphael) or via his Web site, jrstart.com.