AV Testing Guidelines Coming Next Year
Consumers should have more accurate information with which to compare security software suites under a new set of software testing guidelines to be finalized early next year.
Last week, security vendors and software testing organizations agreed during a conference in Seoul to form the Anti-Malware Testing Working Group, which will determine how best to conduct behavioral tests of security software, said Andreas Marx, who works for AV-Test.org, a German antivirus software testing group.
Behavioral tests are time-consuming but important since the style of test replicates how PCs encounter malicious software on the Internet, such as through Trojan horse programs in e-mail attachments or through browser exploits, Marx said.
Those tests are seen as superior to signature-based tests, in which a product's virus detection engine is simply run against a batch of thousands of malware samples. Signature tests also fail to exercise the other protective technologies a product may use to detect a threat, such as flagging a new program that starts communicating with a remote server over the Internet.
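To illustrate the gap between the two approaches, here is a minimal Python sketch of hash-based signature matching, assuming a hypothetical signature database; the one hash shown is the SHA-256 of the standard EICAR antivirus test string, and the sample names are invented for illustration:

```python
import hashlib

# Hypothetical signature database: SHA-256 digests of known malware samples.
# The entry below is the SHA-256 of the standard EICAR test string.
SIGNATURE_DB = {
    "275a021bbfb6489e54d471899f7db9d1663fc695ec2fe2a2c4538aabf651fd0f",
}

def signature_scan(file_bytes: bytes) -> bool:
    """Return True if the file's hash matches a known-malware signature."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in SIGNATURE_DB

# A known sample is caught because its exact hash is in the database.
eicar = b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"

# A brand-new or repacked sample has no signature yet, so a pure signature
# scan misses it -- even if at runtime it would exhibit suspicious behavior
# such as contacting a remote server. That runtime gap is what behavioral
# tests are meant to cover.
new_sample = b"freshly repacked sample with no signature yet"
```

Here `signature_scan(eicar)` returns True while `signature_scan(new_sample)` returns False, even though only runtime behavior would reveal whether the unknown sample is malicious.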
"A very big part of the real world is missing," Marx said. "Most products are tested against a set of outdated viruses. In most cases, most AV products will pass these tests."
The cooperation between security vendors is notable, especially in the highly competitive security software industry. Vendors Panda, F-Secure, Sunbelt Software and Symantec are participating as well as AV-Test.org and Virus Bulletin, another testing organization based in Abingdon, England.
Marx has written a draft of a behavioral testing scheme. Early next year, the Anti-Malware Testing Working Group will refine those guidelines for use by groups such as AV-Test.org and Virus Bulletin. Use of the guidelines, however, will be voluntary.
Most vendors feel the new behavioral tests will more fully evaluate the different ways their products can protect a PC. Security companies have often publicly argued over signature-based tests, with disputes centering on the age of the virus samples used.
Companies that fail a signature test often argue that the particular sample that caused them to fail was too old and not even commonly found on the Internet. Some security vendors will remove signatures in their products for older malicious software so PCs are not burdened with large signature databases.
The Anti-Malware Testing Working Group will also provide an unbiased forum for those disputes. Today, "there is little recourse" if a vendor fails a test and has an issue with the test's parameters, said Mark Kennedy, an antivirus engineer with Symantec.
There is concern, however, that behavioral tests may put too much of a strain on testing groups, Marx said. Setting up real-world malicious software scenarios takes far more time, which is why a behavioral test is usually limited to around 50 current malware samples, he said.
However, testing organizations are in early discussions that could result in some cooperation in order to reduce that burden, he said.
The results show that none of the eight products tested performed very well in behavioral tests, owing to the increasing sophistication of malware. Security companies have said their labs are struggling to keep up with the startling increase in the quantity of malware circulating on the Internet.
Marx said his lab alone receives between 2,000 and 2,500 different samples of malicious software per hour.