Are Coders Getting Careless?
The security holes exploited by Code Red and Nimda, worms that experts said had the potential to knock the entire Internet offline, were long-standing vulnerabilities in Microsoft's IIS Web server, all stemming from a classic coding error: the buffer overflow.
A buffer overflow occurs when a program writes more data into a fixed-size region of memory, a buffer, than that region can hold, overwriting adjacent memory, often with unpredictable results. Frequently, however, buffer overflows allow attackers to run any code they choose on a target PC.
When Code Red and Nimda struck last year, many security experts were left to wonder why the vulnerabilities hadn't been patched. A better question might be why buffer overflows, a class of error that has been known and avoidable for at least 30 years, still crop up with such regularity.
Buffer overflows have continued to cause problems in 2002.
Pressures from companies, consumers, educators, and students combine to create a situation in which techniques that could stop persistent security holes, like buffer overflows, are known but are neither used nor taught sufficiently. Consumers say they want security, but instead buy cheaper products with more features. As a result, vendors have little incentive to put security ahead of features.
This matrix of factors, one that touches on nearly every aspect of the technology industry, makes buffer overflows a common problem. The Software Engineering Institute (SEI), an organization that studies software building processes, maintains a database of security vulnerabilities in which buffer overflows account for more than 15 percent of all vulnerabilities and for 75 percent of the top ten most serious vulnerabilities, according to Shawn Hernan, SEI team leader for vulnerability handling.
The question of how to create more secure products is one that vexes even the largest software companies. In January, prodded by a memo from cofounder Bill Gates, Microsoft launched its Trustworthy Computing initiative, putting security ahead of new features.
"We've done a terrific job at [adding features], but all those great features won't matter unless customers trust our software," Gates wrote in his memo. "So now, when we face a choice between adding features and resolving security issues, we need to choose security."
One place to look into the question of why buffer overflows are still so common is where programmers are trained: colleges.
"One of the problems is that the educational establishment generally [doesn't] teach secure programming at the undergraduate, or even graduate, level," said the Software Engineering Institute's Hernan.
A number of colleges known for their computer science programs offer, at best, only the most basic security classes, and few offer courses on writing secure code.
At Carnegie Mellon University (CMU), "I don't think security [as] defense from malicious attacks is ever explicitly covered" in the first two years of an undergraduate degree, said Jim Morris, dean of the School of Computer Science.
Though CMU offers a "fairly rigorous, but broad" undergraduate computer education, there are currently no security courses at that level, Morris said. Students spend the first three to four semesters working on foundational issues such as data structures and functional programming, he said.
Carnegie Mellon is not alone. Syracuse University offers no specific security courses to undergraduates, though security is a topic covered in four to five courses at that level, according to Steve Chapin, associate professor of electrical engineering and computer science and director of the Center for Systems Assurance.
Stanford University, however, offers a program called the Security Lab, where up to ten faculty members work with undergraduates on security issues, said Dan Boneh, assistant professor of computer science and electrical engineering. Part of the lab's mission is to teach students about secure code and how to maintain it, he said.
Despite the lack of focus on security at the undergraduate level, graduate programs at all three schools address security. In addition, the U.S. National Security Agency runs a program called Centers of Academic Excellence in Information Assurance in Education that helps colleges and universities bolster security curricula through training.
The paucity of secure coding education for undergraduates means that colleges and universities bear some responsibility for the persistence of security holes, said Syracuse's Chapin.
Despite this, students are becoming increasingly drawn to security, said Stanford's Boneh. Not only are his students starting to see security as an important subject, but they are also being asked about security in job interviews, he said.
Shawn Hernan of the Software Engineering Institute sees the matter differently, however. Industry is pushing colleges to make sure their students can write code to correctly achieve a goal the first time, without a concern for security, he said.
"Rarely are programs examined for their quality," he said, adding that they are instead judged on what they do and whether they meet specification.
The Software Engineering Institute has published a list of software development techniques that, Hernan said, can help developers avoid these mistakes and write more-secure code. A more widespread adoption of those techniques will help cut down on common vulnerabilities, he said.
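One such technique is making every input operation carry an explicit bound. As a hedged illustration (the function name read_line is an assumption for this sketch, not an SEI-published routine), compare the notoriously unbounded gets() with its size-aware replacement:

```c
#include <stdio.h>
#include <string.h>

/* gets() has no way to limit how much input it accepts and caused many
 * historical overflows; fgets() takes the buffer size explicitly, so
 * oversized input is truncated instead of overrunning the buffer. */
int read_line(char *buf, size_t size, FILE *in) {
    if (fgets(buf, (int)size, in) == NULL)
        return -1;                      /* EOF or read error */
    buf[strcspn(buf, "\n")] = '\0';     /* strip trailing newline, if any */
    return 0;
}
```

The design point is the same as in the SEI's guidance: the caller states the buffer's capacity at the point of use, so no input, however long, can write past it.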
Security tests and audits that might be performed otherwise are overlooked due to market pressures, the drive to ship, and the need to provide features that the competition doesn't, said Stanford's Boneh.
Such an auditing process is a crucial step in the development of secure software, according to Izhar Bar-Gad, chief technology officer of application security firm Sanctum. Developers ought to go through three security audit phases, he said: developer audits, quality assurance audits, and external audits.
"Technology companies are responding to consumer demand," he said. "What people say they want is security. What they buy is something that works right out of the box," whether it's secure or not.
Consumers should also exert pressure, said Arthur Wong, chief executive officer of SecurityFocus.
"I've never heard anyone ask a vendor, 'Is your software secure?' unless you're selling a firewall," Wong said. "If consumers ask for it, there will be vendors out there who fulfill that need. What we shouldn't accept as consumers...are the standard vulnerabilities being found in software."
Carnegie Mellon's Jim Morris agreed, pointing to other industries, such as the automobile industry, in which consumer action led to the government imposing stricter safety standards.
The software development industry needs to create feedback loops for developers, so that they learn about and learn from their mistakes, Hernan suggested. He pointed to the example of civil engineers, who serve apprenticeships and take legal responsibility for their work.
Many security mistakes made in software are "the computer equivalent of forgetting to put the last screw in the hinge [on a door]," he said. Such mistakes would never be acceptable in the physical world, and applying the disciplines and standards of physical engineering would help cut them down in software, he added. But the problems run deep, he said.
"There are so many things that need to change about software in order for this problem [of secure software] to substantially improve, real solutions are years off," he said. "These are deep, systemic problems that don't lend themselves to trivial solutions."