Anti-virus roundup: Performance Analysis
Testing anti-virus software is a tricky endeavour at the best of times, due to the inherently nasty nature of the test subject. For this anti-virus roundup, we asked the independent anti-virus testing lab AV-Test.org (www.av-test.org) to conduct the testing for us in its controlled environments. AV-Test.org has access to thousands of viruses and is a respected anti-virus testing lab.
All programs were tested on systems running Windows XP, with Service Pack 1 installed. Furthermore, all Microsoft patches and updates from www.windowsupdate.microsoft.com were applied to each test system. The latest versions of all programs were tested with the most up-to-date virus definition patterns installed, plus any patches.
The first part of the test was performed against self-replicating viruses and worms which can be found 'In the Wild' (ItW). These included viruses such as Melissa, Loveletter, MS-Blaster and Sobig, to name only a few. The WildList Organisation (www.wildlist.org) collects reports from infected PCs worldwide to create a list of common malware (malicious software). For this test, all viruses and worms from the current WildList 10/2003 were used - a total of 368 different pieces of malware.
For every different virus or worm, exactly two infected files were included in the test collection, so the test set contained 736 infected files. All of these virus samples had been carefully tested to ensure their infectiousness and ability to replicate.
The anti-virus programs were tested repeatedly on these infected files, using both the on-demand scanner (which the user invokes to scan the system) and the on-access guard (which runs in the background and checks files as they are accessed). To ensure high quality test results, two independent testers double-checked every product.
However, today the biggest troublemakers are not only viruses and worms, but also intentionally destructive (but not self-replicating) programs like Trojan horses and backdoors. Because there is no WildList for these malicious programs, AV-Test.org tested all of the anti-virus programs against its full collection of them. At the time of the test, this collection comprised 11,349 different samples of backdoor programs and 14,288 Trojan horses.
As with the virus samples, only those Trojans that have a malicious payload and can be harmful to the user were tested. Files which cannot be classified as dangerous were excluded from the test set.
Likewise, only those backdoor files that can actually be used to hijack someone's PC were tested; the many non-functional backdoor programs available on the internet were removed from the test set. It should be noted that AV-Test.org tested both the 'client' and the 'server' parts of the backdoors in the collection. While the 'server' part - the piece installed on a victim's PC so that it can be controlled remotely - is the most dangerous, the 'client' part used by the hacker is an unwanted file as well. A protection program should therefore find both parts in order to remove the malware completely.
Please note: while the number of worms and viruses tested constitutes only a small fraction of those that have been catalogued (roughly 0.5 percent of the 70,000 known viruses and worms), it covers the majority of the malicious code that still works. Most of the early viruses were written for DOS and are no longer capable of infecting or propagating through Windows, so testing against them would add little.
We rated the performance of the anti-virus programs with weighted rankings for the software's ability to detect worms, viruses, Trojans and backdoors. Worms and viruses are the more dangerous, so a package's ability to detect these was weighted higher than those for backdoors or Trojans. Of the overall Performance score, worm and virus detection rated 35 percent each, and backdoor and Trojan horse detection 15 percent each.
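As a concrete illustration, the weighting above can be expressed as a simple calculation. This is a sketch only: the function name and the detection rates shown are hypothetical, not results from the test.

```python
def performance_score(worm, virus, backdoor, trojan):
    """Weighted Performance score from four detection rates (0-100).

    Weights as described above: worms and viruses 35 percent each,
    backdoors and Trojan horses 15 percent each.
    """
    return 0.35 * worm + 0.35 * virus + 0.15 * backdoor + 0.15 * trojan

# Hypothetical package: perfect worm/virus detection, 96.5 percent on backdoors
print(round(performance_score(100, 100, 96.5, 100), 3))  # -> 99.475
```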
When we looked at the anti-virus programs we took into account not just the ability to scan for and detect viruses, but also looked at the user interface and the program's ease-of-use. Anti-virus scanning can be a complicated enterprise, but it needn't be. A good interface with ease-of-use in mind can mean the difference between a user scanning their PC frequently or hardly at all.
With this in mind, we also rated the anti-virus packages from one to 10 for their user interface, and from one to 10 for their ease-of-use. The latter included the installation and configuration of the package, as well as how easy it was to use day-to-day.
These scores were then weighted - ease of use was given three times the prominence of the interface - to get our final Ease of Use score.
The third ranking is Value for Money, which was derived from the package's price, its Performance score and its Ease of Use score. There's a lot of variation in the value scores for this Labs, because two of the packages (AVG Anti-Virus and eTrust EZ Antivirus) are currently free, which puts the packages that cost nearly $100 into sharp relief. We normalised the pricing to lessen this discrepancy somewhat.
The fourth and final rating is the Overall score, which takes a weighted mean of the Performance, Ease of Use and Value for Money scores, with the program's performance taking 50 percent, ease of use and functionality 20 percent, and pricing 30 percent.
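Putting the scoring rules together, the Ease of Use and Overall calculations might look like this. A sketch under assumptions: the weights come from the text, but the exact normalisation (here a weighted average) is a guess, and the input scores are hypothetical.

```python
def ease_of_use_score(ease, interface):
    """Ease of Use: day-to-day ease of use weighted three times the
    interface score (both rated 1-10), taken as a weighted average."""
    return (3 * ease + interface) / 4

def overall_score(performance, ease_of_use, value):
    """Overall: Performance 50 percent, Ease of Use 20 percent,
    Value for Money 30 percent."""
    return 0.5 * performance + 0.2 * ease_of_use + 0.3 * value

# Hypothetical package scores (all on a 0-10 scale)
eou = ease_of_use_score(8, 4)              # -> 7.0
print(round(overall_score(9, eou, 6), 2))  # -> 7.7
```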
Good anti-virus software should detect at least 90 percent of malicious worms and virus programs. Higher detection rates are even better, but a piece of anti-virus software is only as good as the company behind it and how often it makes virus definition updates available. If a software tool detects less than 60 percent of the test set, it should be considered risky to rely on and shouldn't be used as the only malware protection software.
Thankfully, with the latest virus definition patterns installed, all of the software packages tested returned 100 percent detection rates when using both the on-demand scanner and the on-access guard.
This was not the case with the backdoor and Trojan horse tests, where the results varied wildly. Although arguably not as important as virus and worm detection, the ability to detect and deal with these two threat types is a significant capability. Only Kaspersky Anti-Virus scored perfectly, detecting every sample. McAfee VirusScan 8.0 lagged just behind, missing only 3.5 percent of the backdoor malware but picking up everything else. eTrust EZ Antivirus and Norman Virus Control languished at the bottom of the pack, with well under half the Trojans and backdoors detected. If you're worried about the hacker set, then these programs do not offer the level of protection you need.
AV-Test.org is a joint-research project of the Business-Information-Workgroup at the Institute of Technical and Business Information Systems at the Otto-von-Guericke University Magdeburg, Germany and AV-Test GmbH. About 15 professors and students currently work in the lab and regularly test anti-virus and other security applications for effectiveness.
This Group Test appeared in the March, 2004 issue of PC & Tech Authority Magazine