Outsourcing incenses employee
"Sally," a systems administrator and a database manager, had been with a Fortune 500 consumer products company for 10 years and was one of its most trusted and capable IT workers, according to Larry Ponemon, founder and chairman of the Ponemon Institute, an IT security research firm.
She was known as a pinch-hitter -- someone who was able to help solve all kinds of problems. For that reason, she had accumulated many high-level network privileges that went beyond what her job required. "There is this tendency to give these people more privileges than they need because you never know when they'll need to be helping someone else out," says Ponemon.
She sometimes worked from home, taking her laptop, which was configured with those high-level privileges. The company's culture was such that IT stars like Sally were given special treatment, says Ponemon. "The IT people made an end run around certain policies," he says. "They could decide what tools they wanted on their systems."
But when the corporation decided to outsource most of its IT operations to India, Sally didn't feel so special. Although the company had not yet formally notified the IT staff, says Ponemon, it was obvious to IT insiders that time was running out for most of the department's employees.
Sally wanted revenge. Before she was officially let go, she planted logic bombs that caused entire racks of servers to crash once she was gone.
At first, the company had no clue what was going on. It switched to its redundant servers, but Sally had planted bombs in those as well. Containing the damage was difficult because the failures followed no apparent rhyme or reason. "A malicious employee [who's] angry can do a lot of damage in a way that's hard to discover immediately and hard to trace later," Ponemon notes.
Eventually, the company traced the sabotage to Sally and confronted her. In return for her agreement to help fix the systems, the company declined to prosecute. In addition, Sally had to agree never to talk publicly about the incident. "They didn't want her going on Oprah and talking about how she broke the backbone of a Fortune 500 company," Ponemon says.
Cost to the company
The estimated total cost to the company: $7 million, which includes $5 million in opportunity costs (downtime, disruption to business and potential loss of customers) and $2 million in fees for forensics and security consultants, among other things.
What did the company do wrong? First, the incident is a classic example of privilege creep (sometimes loosely called "privilege escalation"), which is what happens when privileges are granted to an individual to handle a specific task but are not revoked when the person no longer needs them, says Ponemon.
Second, an entitlement culture led to no separation of duties and very little oversight of IT. Because of that, management missed an important red flag. After the incident, the company discovered that Sally had "lost" 11 laptops over the previous three years. The help desk staff was aware of this, but no one ever reported it to management, partly because of Sally's status in the organization. Nobody knows what she did with those laptops; it could be that she was just careless -- but "that's a problem in and of itself if you're a systems administrator," Ponemon observes.
Third, given the tense atmosphere created by the outsourcing decision, the company should have been more vigilant and more proactive in monitoring potentially angry employees.
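The first of these failures, privileges piling up beyond what a role requires, is exactly the kind of thing a periodic automated access review can surface. A minimal sketch of the idea, with purely illustrative role and privilege names (not from any real system):

```python
# A hypothetical periodic access review: compare each user's granted
# privileges against what their current roles require, and flag the excess.
# Role and privilege names here are illustrative only.

ROLE_PRIVILEGES = {
    "dba": {"db_admin", "db_backup"},
    "sysadmin": {"server_login", "patch_deploy"},
}

def find_excess_privileges(user_grants, user_roles):
    """Return {user: privileges granted beyond what their roles require}."""
    excess = {}
    for user, granted in user_grants.items():
        required = set()
        for role in user_roles.get(user, []):
            required |= ROLE_PRIVILEGES.get(role, set())
        extra = granted - required
        if extra:
            excess[user] = extra
    return excess
```

Run against a "Sally" who is a DBA but holds sysadmin-level grants, the review flags the surplus privileges for revocation or justification; the hard part in practice is keeping the role definitions honest, not the comparison itself.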
Even if you haven't announced anything to your employees, it's a mistake to think they don't know what's going on, says Ponemon. "The average rank-and-file [worker] knows within a nanosecond of when the CEO signs the [outsourcing] contract," he says. If you aren't already monitoring your IT people, now is the time to start. For best results, kick off the program with a very public pronouncement that you are now monitoring the staff.
According to CERT, many cases of sabotage are the result of a disgruntled employee acting out of revenge. And those acts can happen in the blink of an eye, as the next story illustrates.
A firing gone very wrong
When this Fortune 100 company upgraded its security, it made a nasty discovery. One of its senior system admins, who had been there at least eight years, had surreptitiously added a page to the company's e-commerce Web site. If you typed in the company URL followed by a certain string of characters, you got to a page where this admin, whom we'll call "Phil," was doing a brisk business selling pirated satellite TV equipment, primarily from China, according to Jon Heimerl, director of strategic security for Solutionary, a managed security services provider hired to address the problem.
The good news: Improved security caught the perpetrator. The bad news: A flawed firing procedure gave him the opportunity to take a parting shot.
Itself a retailer of high-tech equipment, the company wanted to get rid of Phil and the Web site as quickly as possible because it feared lawsuits from satellite equipment manufacturers. But while Phil's manager and security staffers were on their way to his office, a human resources representative called Phil and told him to stay put. Heimerl isn't sure exactly what the HR person said, but it was apparently enough for Phil to guess that the jig was up.
Already logged into the corporate network, he immediately deleted the corporate encryption key ring. "As he was hitting the delete key, security and his manager showed up and said, 'Stop what you're doing right now, and step away from the terminal,'" according to Heimerl. But it was too late.
The file held all the encryption keys for the company, including the escrow key, a master key that allows the company to decrypt any file of any employee. Most employees kept their own encryption keys on their local systems. However, the key ring held the only copies of encryption keys for about 25 employees -- most of whom worked in the legal and contracts departments -- and the only copy of the corporate encryption key. That meant that anything those employees had encrypted in the three years since they had started using the encryption system was permanently indecipherable -- and thus, virtually lost to them.
Cost to the company
Heimerl hasn't calculated how much money the incident cost the company, but he estimates the loss of the key ring file amounted to about 18 person-years of lost productivity, which takes into account both the work that went into creating files that are now permanently encrypted and the time devoted to re-creating materials from drafts, old e-mails and other unencrypted documents.
Focusing only on what happened after the rogue Web site was discovered, the company made two crucial mistakes, says Heimerl. It should have shut down Phil's access immediately upon discovering his activities. But managers also left themselves vulnerable by not keeping a secure backup of critical corporate information. (Ironically, the company thought the key ring was so sensitive that no copies should be made.)
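There is a well-known way out of the "too sensitive to copy" trap: split the key material with a threshold scheme such as Shamir's secret sharing, so that no single backup copy reveals the key, yet any quorum of custodians can reconstruct it. A minimal sketch over a prime field (an illustration of the technique, not what this company used):

```python
import secrets

# Minimal Shamir secret sharing over a prime field.
PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 128-bit secret

def _eval_poly(coeffs, x, p):
    """Evaluate a polynomial (constant term first) at x, mod p."""
    result = 0
    for coeff in reversed(coeffs):
        result = (result * x + coeff) % p
    return result

def split_secret(secret, n_shares, threshold, p=PRIME):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(p - 1) + 1
                         for _ in range(threshold - 1)]
    return [(x, _eval_poly(coeffs, x, p)) for x in range(1, n_shares + 1)]

def recover_secret(shares, p=PRIME):
    """Recover the secret from a threshold-sized subset of shares
    via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % p
                den = (den * (xi - xj)) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret
```

Had the escrow key been split, say, five ways with a threshold of three, Phil's delete key would have destroyed at most one share, and fewer than three shares leak nothing about the key itself.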
The best defense is multipronged
The overall lesson from these horror stories is that no single measure can protect you from rogue IT people. You might have great technical security -- like the multitiered system that ultimately detected Phil's unauthorized Web site -- and yet a simple mistake by HR can lead to disaster. And glaring behavioral or personality red flags -- like Sally's missing laptops -- can go unnoticed.
It's a combination of technical safeguards and human observation that offers the best protection, says CERT's Cappelli.
And yet it's hard to convince companies to do both. Executives tend to think such problems can be solved by technology alone, at least partly because they hear vendors of monitoring tools and other security-minded software claiming that their tools offer protection. "We're trying to figure out how to get the message to the C-level people that this is not just an IT problem," she says.
It's a difficult message to hear. And a lesson that many companies don't learn except the hard way. Even if more companies were forthcoming with the details of their horror stories, most CEOs would still think it could never happen to them. Until it does.
Frequent Computerworld contributor Tam Harbert is a Washington, D.C.-based writer specializing in technology, business and public policy. She can be contacted through her Web site, TamHarbert.com.
This story, "Security Fail: When Trusted IT People Go Bad" was originally published by Computerworld.