Twice a year, independent antivirus testing lab AV-Comparatives.org releases a report comparing how well 15 to 20 antivirus products detect malware in an on-demand scan. In the latest report, quite a few products improved their ratings, though a couple of cloud-based tools failed to complete the test. AV-Comparatives also switched to a new method of rating false positives.
Many Improved Ratings
Every product that passes the on-demand scan test rates at least STANDARD. Those that do a bit better than the rest rate ADVANCED, and the very best products rate ADVANCED+. If a product doesn’t pass, it receives the rating TESTED.
Four products that rated ADVANCED in February’s test moved up to ADVANCED+. They are: avast! Free version 6.0, ESET NOD32 Antivirus 5, G Data AntiVirus 2012, and Panda Cloud Anti-Virus 1.5.
AVG Anti-Virus Free 2012 moved up from STANDARD to ADVANCED, and Trend Micro Titanium Antivirus+ 2012 leapt all the way from STANDARD to ADVANCED+.
A Few Going Down
A few products slipped in the ratings. TrustPort Antivirus 2012 and eScan Anti-Virus 11 dropped from ADVANCED+ to ADVANCED. PC Tools Spyware Doctor with AV 8.0 didn’t pass this time, falling from STANDARD to merely TESTED.
Problems in the Cloud
Webroot AntiVirus with Spy Sweeper relies on cloud-based protection supplied by Sophos for enhanced malware detection. During this particular test, the cloud-based backend didn’t work properly, so AV-Comparatives could only estimate scores for Sophos and Webroot. Because of this problem, neither product was assigned a rating.
The report states: “The cloud should be considered as an additional benefit/feature to increase detection rates… and not as a full replacement for local offline detection. Vendors should make sure that users are warned in case the connectivity to the cloud gets lost… [This] may affect considerably the provided protection and make… the scan useless.”
Note that the Webroot product tested will be replaced by a completely new product next week. The new Webroot SecureAnywhere Antivirus relies almost 100 percent on cloud-based detection.
Problems with Vendors
The report also noted that some vendors don’t react well to low ratings. “We observed some few vendors potentially are trying to game the tests to get higher scores… Some try disputing every malicious files which are not detected by their own product as ‘unimportant/non-prevalent,’ even if other telemetry shows otherwise… Furthermore, some vendors which see themselves scoring low in a test often aim to get their results removed from a test for marketing reasons. But we do not allow to withdraw from tests as we want to provide results to our readers.”
False Positives Ranked by Prevalence
A false positive occurs when an antivirus tool identifies a good, valid file as malicious. Some testers are very strict about FPs. Virus Bulletin, for example, withholds its VB100 award if a product generates even one FP.
The Anti-Malware Testing Standards Organization (AMTSO) recommends that testers look at the significance of false positives. Erroneously deleting a file that’s critical to system operation is more significant than deleting a non-critical application. Deleting a file found on hundreds of thousands of PCs is more significant than deleting a program with just a few users. In addition to criticality and prevalence, the AMTSO document points out that recoverability is an issue.
In this latest test, AV-Comparatives switched from simply counting FPs to rating them by the prevalence of the affected file. McAfee AntiVirus Plus 2011 had no FPs at all. Microsoft Security Essentials 2.0 had just one, at the lowest prevalence (under a hundred users). Kaspersky Anti-Virus 2012 also had just one, but this was a file estimated to have several thousand users.
Near the other end of the scale, Norton had 57 false positives. However, most of these were at the lowest prevalence, a few at the second lowest (hundreds of users), and just one of actual significance (tens of thousands of users). Norton’s File Insight analysis is deliberately designed to strongly suspect very uncommon files of being malicious, so this result isn’t surprising.
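To see why prevalence matters more than a raw count, consider a toy scoring sketch. AV-Comparatives does not publish a weighting formula, so the bands, weights, and function below are purely hypothetical assumptions used to illustrate the idea:

```python
# Illustrative sketch only: AV-Comparatives does not publish a formula.
# These prevalence bands and weights are hypothetical assumptions.
PREVALENCE_WEIGHTS = {
    "under_100_users": 1,      # lowest prevalence band
    "hundreds": 5,
    "thousands": 25,
    "tens_of_thousands": 125,  # most significant band in this sketch
}

def weighted_fp_score(false_positives):
    """Sum the hypothetical weights over a list of (file, band) FPs."""
    return sum(PREVALENCE_WEIGHTS[band] for _, band in false_positives)

# A Norton-like result: 57 FPs, almost all at low prevalence.
norton_like = (
    [(f"file{i}", "under_100_users") for i in range(53)]
    + [(f"file{i}", "hundreds") for i in range(3)]
    + [("common_file", "tens_of_thousands")]
)
# A Kaspersky-like result: a single FP on a file with thousands of users.
kaspersky_like = [("popular_file", "thousands")]

print(weighted_fp_score(norton_like))     # 53*1 + 3*5 + 1*125 = 193
print(weighted_fp_score(kaspersky_like))  # 25
```

Under a weighting like this, many low-prevalence FPs can still outweigh a single mid-prevalence one, but the gap is far smaller than the raw counts (57 versus 1) suggest, which is the point of rating by prevalence.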
You can view the full report at the AV-Comparatives Web site. The document detailing false-positive testing lists every single FP file for every vendor, along with its prevalence.
Article source: http://www.pcmag.com/article2/0,2817,2393736,00.asp?kc=PCRSS03069TX1K0001121