The New Virus Fighters

How We Tested Antivirus Software

We tested stand-alone antivirus products where possible, and only the antivirus scanning engines of products that bundle multiple security components. The PC World Rating is a weighted average of specifications (10 percent), price (10 percent), design (30 percent), and performance (50 percent).

Performance Tests Explained

AV-Test, a German security firm, evaluated how well the programs could detect 1518 WildList threats and 136,250 threats from its own zoo of backdoor programs, bots, and Trojan horses. AV-Test evaluated each program's heuristics by using one-month-old and two-month-old versions of the programs, which wouldn't have the benefit of subsequent malware signatures. In the one-month-old heuristic tests, AV-Test measured how well the programs could detect 244 backdoor programs and 37 worms; in the two-month-old heuristic tests, 555 backdoor programs and 101 worms.

AV-Test evaluated how well the programs could detect and clean 110 macro viruses affecting Microsoft Office applications. AV-Test also compiled data on how quickly software companies released virus signatures for 16 new outbreaks over a period of eight months in 2005. PC World tested how quickly each program ran a system scan on a test set of files and folders.

Performance results are a weighted average of WildList tests (30 percent), zoo tests (15 percent), one-month-old heuristic tests (20 percent), two-month-old heuristic tests (10 percent), macro virus results (10 percent), outbreak-response-time tests (10 percent), and scan-speed tests (5 percent).
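To see how those weights combine, here is a minimal sketch of the weighted-average calculation described above. The weights come from the article; the per-test scores are hypothetical placeholders, not actual PC World results.

```python
# Weights from the article's performance methodology; they sum to 1.0.
WEIGHTS = {
    "wildlist": 0.30,
    "zoo": 0.15,
    "heuristic_1mo": 0.20,
    "heuristic_2mo": 0.10,
    "macro": 0.10,
    "outbreak_response": 0.10,
    "scan_speed": 0.05,
}

def performance_score(scores: dict) -> float:
    """Weighted average of per-test scores, each on a 0-100 scale."""
    return sum(WEIGHTS[test] * scores[test] for test in WEIGHTS)

# Made-up scores for a hypothetical product (illustration only).
example = {
    "wildlist": 100.0,
    "zoo": 90.0,
    "heuristic_1mo": 60.0,
    "heuristic_2mo": 50.0,
    "macro": 95.0,
    "outbreak_response": 80.0,
    "scan_speed": 70.0,
}
print(round(performance_score(example), 1))  # prints 81.5
```

Note how the weighting rewards real-world detection: a perfect WildList score contributes 30 points toward the total, while a perfect scan-speed score contributes only 5.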

Tony Bradley is a network security consultant and the lead writer for About.com's Internet/Network Security Web site. Narasu Rebbapragada is an associate editor for PC World.
