Anti‐Virus Comparative ‐ Performance Test (AV Products) ‐ November 2011 www.av‐comparatives.org
Anti-Virus Comparative
Performance test (AV Products)
Impact of Anti-Virus Software on System Performance
Language: English
November 2011
Last Revision: 8th December 2011
www.av-comparatives.org
Table of Contents
1. Introduction
2. Tested products
3. Test methods
4. Side notes and comments
5. Test cases
6. Test results
7. Award levels reached in this test
8. Copyright and Disclaimer
Introduction
We want to make clear that the results in this report give only an indication of the impact on system
performance (caused mainly by the real-time/on-access components) of the various Anti-Virus
products in these specific tests. Users are encouraged to try out the software on their own PCs
and form an opinion based on their own observations.
Tested products
The following products were evaluated (with default settings) in this test (November 2011)¹:
avast! Free Antivirus 6.0
AVG Anti-Virus 2012
AVIRA Free Antivirus 2012
Bitdefender Antivirus Plus 2012
eScan Anti-Virus 11
ESET NOD32 Antivirus 5
F-Secure Anti-Virus 2012
G DATA AntiVirus 2012
K7 Antivirus Plus 11.1
Kaspersky Anti-Virus 2012
McAfee AntiVirus Plus 2012
Microsoft Security Essentials 2.1
Panda Cloud Antivirus Free 1.5.1
PC Tools Spyware Doctor with Antivirus 2012
Qihoo 360 Antivirus 2.0
Sophos Endpoint Security 9.7
Symantec Norton AntiVirus 2012
Trend Micro Titanium Antivirus Plus 2012
TrustPort Antivirus 2012
Webroot SecureAnywhere Antivirus 2012
Please note that the results in this report apply only to the product versions listed above (e.g. the
64-Bit versions, the specific product builds, etc.). Also, keep in mind that different vendors offer
different features (and differing quantities of them) in their products.
The following activities/tests were performed under Windows 7 Professional SP1 64-Bit:
File copying
Archiving / Unarchiving
Encoding / Transcoding
Installing / Uninstalling applications
Launching applications
Downloading files
PC Mark 7 Professional Testing Suite
¹ We used the latest product versions available at the time of testing.
Test methods
The tests were performed on an Intel Core i5 2.67 GHz machine with 4GB of RAM and SATA II hard
disks. The performance tests were done first on a clean Windows 7 Professional SP1 64-Bit system
(English), and then again with the Anti-Virus software installed (with default settings). The tests were
done with an active internet connection, to simulate the real-world impact of cloud services/features.
The hard disks were defragmented before starting the various tests, and care was taken to minimize
other factors that could influence the measurements and/or the comparability of the systems. The
optimization processes/fingerprinting used by the products were also considered; this means that the
results represent the impact on a system which has already been used by the user for a while. The
tests were repeated several times (with and without fingerprinting) in order to get mean values and
filter out measurement errors. After each run the workstation was defragmented and rebooted. We
simulated various file operations that a computer user would execute: copying² different types of
clean files from one place to another, archiving and unarchiving files, encoding and transcoding³
audio and video files, converting DVD files to iPod format, downloading files from the Internet,
launching applications, etc. We also used a third-party, industry-recognized performance testing suite
(PC Mark 7 Professional) to measure the system impact during real-world product usage. Readers are
invited to evaluate the various products themselves, to see how they affect their own systems
(software conflicts, user preferences and different system configurations may all lead to varying
results).
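The repeat-and-average approach described above can be sketched as follows. This is a minimal illustration in Python, not the actual lab harness; the timed workload here is a hypothetical 1 MB file copy standing in for the real 4GB test set, and the function name `time_operation` is an assumption for demonstration.

```python
import shutil
import statistics
import tempfile
import time
from pathlib import Path

def time_operation(operation, runs=5):
    """Run `operation` several times and return the per-run wall-clock
    timings together with their mean, mirroring the repeat-and-average
    approach used to filter out measurement errors."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    return timings, statistics.mean(timings)

# Dummy workload: a 1 MB file copy standing in for the real test set.
workdir = Path(tempfile.mkdtemp())
source = workdir / "sample.bin"
source.write_bytes(b"\x00" * 1024 * 1024)

timings, mean_seconds = time_operation(
    lambda: shutil.copyfile(source, workdir / "copy.bin"), runs=5
)
print(f"runs: {len(timings)}, mean: {mean_seconds:.4f}s")
```

In a real comparative test, the machine is additionally defragmented and rebooted between runs, which a sketch like this cannot reproduce.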
Security products need to load on systems at an early stage to provide security from the very begin-
ning – this load has some impact on the time needed for a system to start up. Measuring boot times
accurately is challenging. The most significant issue is to define exactly when the system is fully
started, as many operating environments may continue to perform start-up activities for some time
after the system appears responsive to the user. It is also important to consider when the protection
provided by the security solution being tested is fully active, as this could be a useful measure of
boot completion as far as the security solution is concerned. Some Anti-Virus products load their
services very late in the boot process (sometimes even minutes after the system appears ready), so
the system seems to boot very quickly; users may then notice that the system becomes very slow for
a few moments once the services finally load, and in the meantime the system is also left
insecure/vulnerable. As we do not want to reward such behaviour, we still do not measure boot times.
² We used 4GB of data of various file categories (pictures, movies, audio files, various MS Office documents, PDF
files, applications/executables, Microsoft Windows 7 system files, archives, etc.).
³ Converting MP3 files to WAV, MP3 to WMA, AVI to MPG and MPG to AVI, as well as to iPod format.
Side notes and comments
The on-access/real-time scanner component of Anti-Virus software runs as a background process to
check all files that are accessed, in order to protect the system continuously against malware threats.
On-access scanners check files as soon as they are accessed, while (e.g.) behaviour-blockers add a
different layer of protection by monitoring what a file does once it is already executed/running. The
services and processes that run in the background to perform these tasks also require and use system
resources. Suite products usually have a higher impact on system performance than Anti-Virus-only
products, as more services/features are included and running in the background.
Security products need to be active deep in the system in order to protect it, e.g. to scan processes
that are already active during system start-up and to identify rootkits and other malware. These
procedures add some extra time and thus delay the system boot/start-up.
If a product takes up too many system resources, users get annoyed and may either disable or
uninstall some essential protective features (considerably compromising the security of their system),
or may switch to security software that is less resource-hungry. Therefore, it is important not only
that Anti-Virus software provides high detection rates and good protection against malware, but also
that it does not degrade system performance or trouble users.
While this report looks at how much impact various Internet Security products have on system
performance, the security software is not always the main factor responsible for a slow system.
Other factors also play a role, and if users follow some simple rules, system performance can be
improved noticeably. The next sections address some of the other factors that may play a part.
A few common problems observed on some user PCs:
- Old hardware: If a PC already runs at a snail’s pace because it has ten-year-old hardware,
using modern (security) software may make it unusable.
o If possible, buy a new PC that at least meets the minimum recommended requirements of
the software you want to use. Multi-core processors are preferable.
o Adding more RAM does not hurt. If you use Windows XP or Windows 7, you should have a
minimum of 2GB of RAM. If you use Vista, switch to Windows 7. 64-Bit systems are
preferable, especially as software optimized for such systems will run faster.
o Make sure you have only ONE Anti-Virus program with real-time protection. If your new PC
came with a trial Anti-Virus program, remove this before installing a different AV program.
- Keep all your software up-to-date: Using an Anti-Virus version from e.g. 2009 does not
protect you as well as a newer version would, even though you may still be able to update the
signatures. Please visit http://update.microsoft.com regularly and keep your operating system
up-to-date by installing the recommended patches. Any software can have vulnerabilities and
bugs, so keep all the software installed on your PC up-to-date: this will not only protect you
against many exploits and vulnerabilities, but also give you any other application
improvements that have been introduced.
- Clean up the content of your hard disk:
o If your hard disk is almost full, your system performance will suffer accordingly. Leave at
least 20% of your disk space free and move your movies and other infrequently accessed
files to another (external) disk. If money is not an issue, consider buying solid state
drives (SSDs).
o Uninstall unneeded software. Often, the slowdown that users notice after installing an
Anti-Virus product is due to other software on the PC running in the background (that is,
due to software conflicts or heavy file access by other programs, with each access requiring
anti-virus scanning).
o Remove unneeded entries/shortcuts from the Autostart/start-up folder in the program
menu.
o If your PC is already messed up by residual files and registry entries left over by the
hundreds of applications you have installed and uninstalled over the past years, reinstall a
clean operating system and install only the software you really need (fewer software
installations mean fewer potential vulnerabilities and conflicts, and so on), and use e.g. an
image/backup tool to ensure that you do not have to reinstall everything manually in
future.
- Defragment your hard disks regularly: A fragmented hard disk can have a very big impact on
system performance as well as considerably increasing the time needed to boot up the system.
- Fingerprinting/Optimization: Most Anti-Virus products use various technologies to decrease
their impact on system performance. Fingerprinting is one such technology: files that have already
been scanned are not rescanned for a while (or are rescanned more rarely), or are whitelisted. This
increases speed considerably (especially once the PC has been in use for some time), but also adds
a small potential risk, as not all files are scanned anymore. It is up to the user to decide which
trade-off to prefer. We suggest regularly performing a full-system scan (to be sure that all files
are at least currently found as clean, and to further optimize the fingerprinting).
- Be patient: A delay of a few additional seconds due to Anti-Virus software is not necessarily a
big deal. However, if the performance of your PC still annoys you after you have installed the
Anti-Virus product, even with the suggestions above, you should consider trying out another
Anti-Virus product. (If you only notice a slow-down after using the Anti-Virus for a long time,
there are probably other factors behind it.) Never reduce your security by disabling essential
protection features, just in the hope of gaining a slightly faster PC!
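The fingerprinting/optimization idea described above can be sketched as follows. This is a toy illustration only; the class name, the hash-based cache and the placeholder detection logic are all assumptions for demonstration, not how any tested product actually works.

```python
import hashlib

class FingerprintingScanner:
    """Toy on-access scanner: files whose content fingerprint has already
    been scanned and found clean are skipped on subsequent accesses."""

    def __init__(self):
        self._known_clean = set()  # fingerprints of scanned-clean files
        self.full_scans = 0        # number of real (expensive) scans

    def scan(self, data: bytes) -> bool:
        """Return True if the file content is considered clean."""
        fingerprint = hashlib.sha256(data).hexdigest()
        if fingerprint in self._known_clean:
            return True                  # cache hit: rescan skipped
        self.full_scans += 1             # cache miss: full scan runs
        clean = b"MALWARE" not in data   # placeholder detection logic
        if clean:
            self._known_clean.add(fingerprint)
        return clean

scanner = FingerprintingScanner()
document = b"some office document bytes"
scanner.scan(document)  # first access triggers a full scan
scanner.scan(document)  # second access is served from the cache
print(scanner.full_scans)  # -> 1
```

The trade-off mentioned above is visible here: once a file is in the cache, it is never rescanned, so a signature update would not catch it until the cache is invalidated; real products invalidate fingerprints on file modification and on engine/signature updates.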
Test cases
File copying
Some Anti-Virus products do not scan all kinds of files by design/default (e.g. based on their file
extensions), or use fingerprinting technologies which may skip already-scanned files in order to
increase speed (see the comments in the previous section). We copied a set of different file types that
are widespread on home and office workstations from one physical hard disk to another physical hard
disk.
Archiving and unarchiving
Archives are commonly used for file storage, and the impact of Anti-Virus software on the time taken
to create new archives or to unarchive files from existing archives may be of interest to most users.
We archived a set of different file types that are widespread on home and office workstations from
one physical hard disk to another physical hard disk, and afterwards unarchived them onto a third
physical hard disk. The results already take into account the fingerprinting/optimization technologies
of the Anti-Virus products, as users usually make archives of files they already have on their disk.
Encoding/transcoding
Music files are often stored and converted on home systems, and converting such files takes system
resources. Because of this, many home users may be interested to know whether their Anti-Virus
product imposes a slowdown while converting multimedia files from one format to another. We
encoded and transcoded some multimedia files with FFmpeg, and for the iPod conversion we used
HandBrakeCLI. The impact during FFmpeg and iPod conversion was almost the same.
Installing/uninstalling applications
We installed several programs (e.g. Visual C++, .NET Framework, etc.) with MSI installers, then
uninstalled them, and measured how long this took. We did not consider fingerprinting here, because
usually an application is only installed once.
Launching applications
Office document files and PDF files are very common. We opened some large document files in
Microsoft Office (and closed them again), and some large PDF files in Adobe Acrobat Reader
(likewise). Before each opening, the workstation was rebooted. The time taken for the viewer or
editor application to open and for a document to be displayed was measured. Although we list the
results for the first opening and for subsequent openings, we consider the subsequent openings more
important, as users normally perform this operation several times, by which point the optimization
features of the Anti-Virus products have taken effect, minimizing their impact on the system.
Downloading files
Files are commonly downloaded from the internet. All products were “very fast” in this test.
Test results
These specific test results show the impact on system performance that the Anti-Virus products have,
compared to the other tested Anti-Virus products. The reported data give only an indication and are
not necessarily applicable in all circumstances, as too many other factors can play a part. As we have
noticed that percentages are easily misinterpreted by users (as well as misused by marketing
departments or the press), and that percentages would need adjusting whenever other hardware
specifications are used, we grouped the results into clusters. The impact within each category does
not differ statistically, taking measurement error into account. The categories were defined by the
testers by consulting statistical methods such as hierarchical clustering, and by taking into
consideration what would be noticeable from the user’s perspective compared to the impact of the
other security products.
Overview of single AV-C performance scores

[Colour-coded results table: each vendor (Avast, AVG, AVIRA, Bitdefender, eScan, ESET, F-Secure,
G DATA, K7, Kaspersky, McAfee, Microsoft, Panda, PC Tools, Qihoo, Sophos, Symantec, Trend Micro,
TrustPort, Webroot) is rated in each of the following categories: File copying (on first run / on
subsequent runs), Archiving/unarchiving, Encoding/transcoding, Installing/uninstalling applications,
Launching applications (Open Word and Open PDF, each on first run / on subsequent runs), and
Downloading files.]

Key: slow / mediocre / fast / very fast
PC Mark Tests
In order to provide an industry-recognized performance test, we used the PC Mark 7 Professional
Edition⁴ testing suite. Users of PC Mark 7 should take care to minimize all external factors that could
affect the testing suite, and should strictly follow at least the suggestions documented in the PC Mark
manual, in order to get consistent and valid/useful results. Furthermore, the tests should be repeated
several times to verify them. For more information about the various consumer scenario tests included
in PC Mark, please read the whitepaper on its website⁵.
“Without AV” was tested on a baseline⁶ system which scores 2024 in the PC Mark test.
PC Mark score Points
without AV 2024 ‐
K7 2020 99,8
ESET 2019 99,8
Symantec 2018 99,7
Avast 2017 99,7
F-Secure 2015 99,6
Microsoft 2014 99,5
eScan 2008 99,2
Sophos 2007 99,2
AVIRA 2006 99,1
Kaspersky 2001 98,9
AVG 1996 98,6
Panda 1993 98,5
Webroot 1987 98,2
Qihoo 1985 98,1
G DATA 1984 98,0
Bitdefender 1982 97,9
McAfee 1975 97,6
Trend Micro 1972 97,4
Trustport 1972 97,4
PC Tools 1908 94,3
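The “Points” column appears to be the PC Mark score expressed as a percentage of the 2024-point baseline, rounded to one decimal place (with the decimal comma used above written as a decimal point here). A minimal sketch to verify a few rows:

```python
BASELINE = 2024  # PC Mark 7 score of the test system without AV

def pcmark_points(score: int, baseline: int = BASELINE) -> float:
    """Express a PC Mark score as a percentage of the baseline score,
    rounded to one decimal place (the "Points" column)."""
    return round(score / baseline * 100, 1)

# Spot-check a few rows of the table above:
print(pcmark_points(2020))  # K7       -> 99.8
print(pcmark_points(1984))  # G DATA   -> 98.0
print(pcmark_points(1908))  # PC Tools -> 94.3
```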
⁴ For more information, see http://www.pcmark.com/benchmarks/pcmark7/
⁵ http://www.pcmark.com/wp-content/uploads/2011/05/PCMark7_Whitepaper.pdf (PDF)
⁶ Baseline system: Intel Core i5 (2.67 GHz) machine with 4GB of RAM, ATI Radeon HD4500 (512 MB)
Summarized results
Users should weight the various subtests according to their needs. We applied a scoring system in
order to sum up the various results.
For “file copying” we took the mean values, as we also did for “launching applications” (on
subsequent runs). As in previous performance reports, “very fast” gets 15 points, “fast” gets 10
points, “mediocre” gets 5 points and “slow” gets zero points. This leads to the following results:
AV-C Score PC Mark Score TOTAL
ESET 90 99,8 189,8
K7 90 99,8 189,8
Avast 90 99,7 189,7
Symantec 90 99,7 189,7
F-Secure 90 99,6 189,6
AVIRA 90 99,1 189,1
Kaspersky 90 98,9 188,9
AVG 90 98,6 188,6
Microsoft 88 99,5 187,5
Sophos 88 99,2 187,2
Panda 88 98,5 186,5
Webroot 88 98,2 186,2
eScan 78 99,2 177,2
McAfee 75 97,6 172,6
Trend Micro 75 97,4 172,4
G DATA 70 98,0 168,0
Bitdefender 70 97,9 167,9
Qihoo 63 98,1 161,1
Trustport 60 97,4 157,4
PC Tools 60 94,3 154,3
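The scoring scheme can be reproduced with a short sketch. This is a minimal illustration; the six-category breakdown is an inference from the maximum AV-C score of 90 (6 x 15), and the example ratings are hypothetical, not data from the report.

```python
# Points awarded per rating category, as described above.
POINTS = {"very fast": 15, "fast": 10, "mediocre": 5, "slow": 0}

def avc_score(ratings):
    """Sum the points for a product's subtest ratings."""
    return sum(POINTS[r] for r in ratings)

def total_score(ratings, pcmark_points):
    """AV-C score plus PC Mark points, as in the TOTAL column."""
    return avc_score(ratings) + pcmark_points

# A product rated "very fast" in six categories reaches the maximum
# AV-C score of 6 x 15 = 90; combined with 99.8 PC Mark points this
# gives a total of 189.8, the top score in the table above.
print(avc_score(["very fast"] * 6))          # -> 90
print(total_score(["very fast"] * 6, 99.8))  # -> 189.8
```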
Award levels reached in this test
The following award levels reflect the results reached in this performance test. Please note that
the performance test only tells you how much impact an Anti-Virus product may have on a system
compared to other Anti-Virus products; it does not tell you anything about the effectiveness of the
protection a product provides.
AWARDS PRODUCTS⁷

[Award-level logos accompany the products in the original report.]

ESET
K7
Avast
Symantec
F-Secure
AVIRA
Kaspersky
AVG
Microsoft
Sophos
Panda
Webroot
eScan
McAfee
Trend Micro
G DATA
Bitdefender
Qihoo
Trustport
PC Tools
The above awards have been given based on our assessment of the overall impact results with default
settings under Windows 7 Professional SP1 64 Bit.
7 We suggest considering products with th