Security Fail: When Trusted IT People Go Bad
It's a CIO's worst nightmare: You get a call from the Business Software Alliance (BSA), saying that some of the Microsoft software your company uses might be pirated.
You investigate and find that not only is your software illegal, it was sold to you by a company secretly owned and operated by none other than your own IT systems administrator, a trusted employee for seven years. When you start digging into the admin's activities, you find a for-pay porn Web site he's been running on one of your corporate servers. Then you find that he's downloaded 400 customer credit card numbers from your e-commerce server.
And here's the worst part: He's the only one with the administrative passwords.
Think it can't happen? It did, according to a security consultant who was called in to help the victim, a $250 million retailer in Pennsylvania. You never heard about it because the company kept it quiet.
Despite the occasional headlines about IT folks gone rogue (remember Terry Childs, the network administrator who held the city of San Francisco's network hostage?), most companies sweep such situations under the rug as quickly and as quietly as possible.
An annual survey by CSO magazine, the U.S. Secret Service and CERT (a program of the Software Engineering Institute at Carnegie Mellon University) routinely finds that three quarters of companies that are victimized by insiders handle the matter internally, says Dawn Cappelli, technical manager of CERT's threat and incident management team. "So we know that [what's made public] is only the tip of the iceberg," she says.
By keeping things quiet, however, victimized companies deny others the opportunity to learn from their experiences. CERT has tried to fill that void. It has studied insider threats since 2001, collecting information on more than 400 cases. In its most recent report, 2009's "Common Sense Guide to Prevention and Detection of Insider Threats," which analyzes more than 250 cases, CERT identifies some of the most common mistakes companies make: inadequate vetting during the hiring process, inadequate oversight and monitoring of access privileges, and overlooked behavioral red flags.
But threats from privilege-laden IT employees are especially hard to detect. For one thing, staffers' nefarious activities can look the same as their regular duties. IT employees routinely "edit and write scripts, edit code and write programs, so it doesn't look like anomalous activity," Cappelli says. And they know where your security is weakest and how to cover their tracks. You can't rely on technology, or any single precaution, to protect yourself from rogue IT people. You have to look at the big picture.
"It requires not only looking at what they are doing online but also what's happening in the workplace," says Cappelli. "People really need to understand the patterns here, the story behind the numbers."
Computerworld went looking for some of those stories behind the numbers, incidents that have not been widely reported. Though the victimized companies wouldn't talk, the security consultants who helped clean up the messes would. Although each story has unique circumstances, together they show some of the typical patterns that CERT emphasizes. Employer, beware.
Pirating software -- and worse
The Pennsylvania retailer's tale of woe began in early 2008, when the BSA notified it that Microsoft had uncovered licensing discrepancies, according to John Linkous. Today, Linkous is chief security and compliance officer at eIQ Networks, a security consultancy. His experience with the incident involving the retailer is from his previous job, when he was vice president of operations at Sabera, a now-defunct security consultancy.
Microsoft had traced the sale of the suspect software to a client company's sysadmin. For purposes of this story, we'll call that sysadmin "Ed." When Linkous and other members of the Sabera team were secretly called in to investigate, they found that Ed had sold more than a half-million dollars in pirated Microsoft, Adobe and SAP software to his employer.
The investigators also noticed that network bandwidth use was abnormally high. "We thought there was some kind of network-based attack going on," says Linkous. They traced the activity to a server with more than 50,000 pornographic still images and more than 2,500 videos, according to Linkous.
In addition, a forensic search of Ed's workstation uncovered a spreadsheet containing hundreds of valid credit card numbers from the company's e-commerce site. While there was no indication that the numbers had been used, the fact that this information was contained in a spreadsheet implied that Ed was contemplating either using the card data himself or selling it to a third party, according to Linkous.
The CFO, who had originally received the call from the BSA, and others on the senior management team feared what Ed might do when confronted. He was the only one who had certain administrative passwords -- including those for the core network router/firewall, the network switches, the corporate VPN, the HR system, e-mail server administration, Windows Active Directory administration and Windows desktop administration.
That meant that Ed could have held hostage nearly all the company's major business processes, including the corporate Web site, e-mail, financial reporting system and payroll. "This guy had keys to the kingdom," says Linkous.
So the company and Linkous' firm launched an operation right out of Mission: Impossible. They invented a ruse that required Ed to fly overnight to California. The long flight gave Linkous' team a window of about five and a half hours during which Ed couldn't possibly access the system. Working as fast as they could, the team mapped out the network and reset all the passwords. When Ed landed in California, "the COO was there to meet him. He was fired on the spot."
Cost to the company
Linkous estimates that the incident cost the company a total of $250,000 to $300,000, which includes Sabera's fee, the cost of flying Ed to the West Coast on short notice, the cost of litigation against Ed, the costs associated with hiring a temporary network administrator and a new CIO, and the cost of making all of its software licenses legitimate.
What could have prevented this disaster? Obviously, at least one other person should have known the passwords. But more significant was the lack of separation of duties. The retailer had a small IT staff (just six employees), so Ed was entrusted with both administrative and security responsibilities. That meant he was monitoring himself.
Separating duties can be a particularly tough challenge for companies with small IT staffs, Linkous acknowledges. He suggests small companies monitor everything, including logs, network traffic and system configuration changes, and have the results evaluated by someone other than the system administrator and his or her direct reports. Most important, he says, is to let IT people know that they are being watched.
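The configuration-change monitoring Linkous recommends can be sketched in a few lines. The following is a minimal illustration, not a description of any tool Sabera used: it fingerprints a set of config files with SHA-256 and reports anything that changed, appeared or disappeared since a recorded baseline. The file paths and function names here are hypothetical; in practice the baseline and the report would be kept where the administrator being monitored cannot alter them.

```python
import hashlib


def fingerprint(paths):
    """Map each config file path to a SHA-256 digest of its contents."""
    digests = {}
    for path in paths:
        with open(path, "rb") as f:
            digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests


def detect_changes(baseline, current):
    """Compare a current fingerprint against the recorded baseline.

    Returns which files were modified, which are new, and which
    vanished -- the raw material for a review by someone other than
    the system administrator.
    """
    changed = [p for p in baseline if p in current and baseline[p] != current[p]]
    added = [p for p in current if p not in baseline]
    removed = [p for p in baseline if p not in current]
    return {"changed": changed, "added": added, "removed": removed}
```

A scheduled job could rebuild the fingerprint nightly and mail the diff report to a manager outside the IT chain -- the point being, per Linkous, not just to collect the data but to make sure someone other than the admin reads it.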
Second, the company failed to do a thorough background check when it hired Ed. In CERT's research, 30% of the insiders who committed IT sabotage had a previous arrest history. Although the company had done a criminal background check on Ed (which was clean), it did not verify the credentials on his résumé, some of which were later found to be fraudulent. (He did not, for example, have the MBA that he claimed to have.) Any kind of false credential should raise a red flag.
Third, Ed's personality could have been viewed as a red flag. "He seemed to believe that he was smarter than everyone else in the room," says Linkous, who met Ed face-to-face by posing as an ERP vendor before the sting operation. Ed's arrogance reminded Linkous of the infamous Enron executives. "He was extremely confident, cocky and very dismissive of other people."
CERT has found that rogues often have prickly personalities. "We don't have any cases where, after the fact, people said, 'I can't believe it -- he was such a nice guy,'" says Cappelli.