Kevin Finisterre isn't the type of person you expect to see in a nuclear power plant. With a beach ball-sized Afro, aviator sunglasses and a self-described "swagger," he looks more like Clarence Williams III from the '70s TV show "The Mod Squad" than an electrical engineer.
But people like Finisterre, who don't fit the traditional mold of buttoned-down engineer, are playing an increasingly important role in the effort to lock down the machines that run the world's major industrial systems. Finisterre is a white-hat hacker. He prods and probes computer systems, not to break into them, but to uncover important vulnerabilities. He then sells his expertise to companies that want to improve their security.
Curious about Competence
Two years ago, Finisterre, founder of security testing company Digital Munition, found himself swapping e-mails with a staffer at Idaho National Laboratory's Control Systems Security Program, a project funded by the U.S. Department of Homeland Security that is the first line of defense against a cyberattack on the nation's critical infrastructure.
Finisterre caught the attention of INL in 2008, when he released attack code that exploited a bug in the CitectSCADA software used to run industrial control environments. He'd heard about the INL program, which helps prepare vendors and plant operators for attacks on their systems, and he thought he'd drop them a line to find out how good they really were.
He was not impressed.
Is INL already working with the hacker community? Finisterre wanted to know. He received an off-putting response. The term "hacker" denotes a person of a "dubious or criminal nature" who would "not be hireable by a national laboratory," an INL staffer told him via e-mail.
"He basically lectured me about how INL doesn't interact with hackers and I should be very careful throwing that word around," Finisterre recalled. "I was like, 'Dude, I really hope you're joking, because you're supposed to be at the forefront of the research on this.'"
Call it an early skirmish in a culture clash between two worlds: the independent security researchers accustomed to dealing with tech firms such as Microsoft and Adobe, who have learned to embrace the hacker ethos, and the more conservative companies that develop and test industrial control systems, who often act like they wish these white-hat hackers would go away.
Earlier this year, Dillon Beresford, a security researcher at the consultancy NSSLabs, found a number of flaws in Siemens' programmable logic controllers. He had no complaints about the U.S. Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team, run out of INL. But he said Siemens did a disservice to its customers by downplaying the issues he'd uncovered. "I'm not pleased with their response," Beresford said earlier this year. "They didn't provide enough information to the public."
ICS-CERT was set up two years ago to handle the kind of bugs that Beresford and Finisterre are now finding with ease. The number of incidents funneled through ICS-CERT has increased six-fold in the past few years, from dozens of issues to hundreds, according to Marty Edwards, director of the Control Systems Security Program and the person in charge of ICS-CERT.
"The reason we're seeing such an increase is because, quite frankly, SCADA and industrial control systems [have become] cool," he said. "Things like Stuxnet have raised the attention level that industrial control systems and critical infrastructure systems are getting."
For many hackers, industrial systems are a new frontier in their technical explorations. For others, they're a throwback to the early days of hacking, before PCs became the primary target. Finisterre started out on the telephone system when he was growing up in the small town of Sidney, Ohio. "In the early '90s my mom thought I was messing with the phones at our house, but it turned out that someone was tampering with the phone switch remotely. I ultimately went on a quest to help my mom fight the phone company claims that 'Your son must be doing something to cause all these faulty charges,'" he said.
Nearly 20 years later, as a professional security researcher, he grew bored with the run-of-the-mill software bugs he was finding and turned to industrial systems. That's what led to his work finding holes in CitectSCADA. "It was like an instant transport back to my high school days," he said.
There are signs that he is not alone and that the floodgates are about to open. ICS-CERT is currently working on about 50 known issues, but two researchers from the commercial sector say they've found hundreds more, some perhaps unimportant, but others potentially serious. (See also "Dirty Little Secrets Revealed by Ethical Hackers.")
Billy Rios, a team lead in Google's security group, and Terry McCorkle, a member of the Information Security Red Team at Boeing, were having drinks together in February when they decided to take a close look at the type of industrial software Finisterre and others have been hacking. They wanted to see how many bugs they could find.
Working in their spare time, they downloaded as many industrial software packages as they could -- nearly 400 altogether, from Siemens, Rockwell Automation, Iconics and other vendors. All of them were freely available on the Internet. They set themselves a goal: to find 100 bugs in 100 days. But the pickings were so good they hit their target in three weeks. "We didn't even go through all the software we had, not even close," McCorkle said.
In the end they found 665 issues in server software, driver packages and the Windows-based HMI (human-machine interface) software used to manage the machines on factory floors. Rios and McCorkle rate most of the bugs they've found as "non-critical," but they say about 75 of them could be used by criminals to damage an industrial system. "There's no single class of vulnerabilities that we nailed; it was just all over the board," Rios said.
"Anyone can do this, basically, if they just put the time into this and get an understanding of how this works," Rios added. "It's not like you'll find a bug here and there. It's just like if you put the time into it, it's pretty ridiculous what the results are."
Edwards, the man in charge of ICS-CERT, acknowledged that the group's workload has exploded since it was started in 2009. "We've seen a 600 percent increase in the number of vulnerabilities that have been coordinated and worked through the ICS-CERT," he said. The allure of industrial control systems means more researchers are now focusing on that area, he said.
The situation is reminiscent of what happened to Windows a decade ago, when hackers began picking apart Microsoft's products, McCorkle said. Industrial vendors are "basically just 10 years behind the curve on security. It's like we're going back to the '90s," he said.
When researchers first turned to Microsoft in the late 1990s, the software maker was caught flat-footed. It was only after several years of antagonism between Redmond and the hackers ripping apart its software that Microsoft figured out how to work with hackers.
Researchers became so tired of having the issues they uncovered ignored that they started releasing the technical details publicly in order to force Microsoft to issue a patch. The prospect of that pattern repeating itself in industrial systems is worrying. It's an area where a security flaw could lead to a chemical spill or a widespread power blackout, and where it can take months to schedule and install patches.
Just this week, a researcher named Luigi Auriemma sent the ICS-CERT team scrambling when he published details on four new vulnerabilities in industrial products, something he'd already done several times in the past year. Auriemma, an independent researcher in Milan, believes posting technical details is the quickest way to get things fixed. "Full disclosure is the best way to get attention on this matter," he said in an instant-message interview.
One former INL staffer who worked at the Control Systems Security Program during the time Finisterre released his CitectSCADA code says that there were problems in the early days. "Industry has already had difficult interactions with the 'hacker' culture when these first few vulnerabilities for industrial control systems surfaced a few years ago," said Robert Huber, co-founder of Critical Intelligence, an Idaho company that does research into industrial systems threats. "Back then, the vendors were completely unprepared for these disclosures," he said in an e-mail interview.
But Huber thinks things are improving. "Many security researchers have worked with the vendors, or through an intermediary, to disclose vulnerabilities," he said. "Now, that said, the sheer number and interest may drive more researchers into the space to make a name for themselves without following the disclosure process, resulting in more vulnerabilities that are not coordinated.
"Only time will tell," he said.