The VB100 award was first introduced in 1998. In order to display the VB100 logo, an anti-virus product must have demonstrated in our tests that:
* It detects all In the Wild viruses during both on-demand and on-access scanning.
* It generates no false positives when scanning a set of clean files.
And for anyone who feels like reading it:
VB100 (virus) test procedures
The VB100 (virus) award is granted to any product that passes the test criteria under test conditions in the VB lab as part of the formal VB comparative review process.
The basic requirements are that a product detects, both on demand and on access, in its default settings, all malware known to be 'In the Wild' at the time of the review, and generates no false positives when scanning a set of clean files.
Various other tests are also carried out as part of the comparative review process, including speed measurements, and results and conclusions are included with the review. The results of these secondary tests do not affect a product's qualification for VB100 certification.
Product submissions
Products must be submitted for review by their developers. Deadlines for product submission, along with details of platforms and the WildLists from which the test sets will be compiled, will be announced in advance of the test.
Submissions must be received by VB by the deadline set, with all required components including updates to virus definitions and detection engines. All software must be made available ready to install, update and run in an 'offline' situation, as all testing is performed in a completely sealed environment without access to any external network.
Submissions are accepted in the form of web downloads (the preferred format), email attachments or hard copies sent by post or courier, as long as they arrive before the deadline.
Testing and certification is open to any product vendor, entirely free of charge. By submitting a product for testing vendors agree to have their product reviewed and analysed as VB sees fit, within the scope of the testing methodology presented below. Once a product has been accepted for testing, it may not be withdrawn from the review by the vendor. However, VB reserves the right to refuse to test any product without further explanation.
Award criteria
The requirements for VB100 (virus) certification are:
* 100% detection of malware listed as 'In the Wild' by the WildList Organization.
The WildList to be used for each test will be the latest available at the time of the test deadline. This deadline will be communicated to potential participants, and publicised on the VB website approximately two weeks prior to the submission deadline for each test. All samples in the WildList collection are verified and replicated from originals provided by the WildList Organization.
'Detection' in this case is accepted if the product clearly marks a file as infected in its log or on-screen display, or denies access to it during on-access testing. If such logging or blocking is unavailable or deemed unusable for test purposes, deletion or attempted disinfection of samples will also be an accepted indicator of detection.
* No false positives when scanning VB's collection of known-clean files.
The collection of known-clean files includes the test sets used for speed measurements, and is subject to regular and unannounced updating and enlargement. A false positive will be counted if a product is considered to have flagged a file clearly as infected in its log or on-screen display.
A false positive will not be recorded if a file is labelled as something other than malware, such as adware, or a legitimate item of software with potentially dangerous uses. All other alerts on clean files will be counted as false positives.
Each flag will be judged to mark either a detection, in which case any file so marked counts as a detection in the infected sets or as a false positive in the clean sets, or mere suspicion, in which case neither a detection nor a false positive is recorded. There will be no overlap between the two.
All tests will be performed both on demand and on access. Any failure to detect a sample from the WildList set, in either mode, or a false positive alert in either mode, will result in a product failing to qualify for the VB100 award.
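The qualification rule above boils down to a simple conjunction: every WildList sample detected in both modes, and zero false positives on the clean sets in both modes. As a minimal sketch, assuming hypothetical result dictionaries mapping each file to whether the product flagged it (the function and parameter names here are illustrative, not part of VB's methodology):

```python
def qualifies_for_vb100(on_demand_wild, on_access_wild,
                        on_demand_clean, on_access_clean):
    """Hypothetical sketch of the VB100 pass/fail rule.

    Each argument maps a file name to True if the product flagged it.
    The wild dicts cover WildList samples; the clean dicts cover
    VB's known-clean collection.
    """
    # All WildList samples must be detected in BOTH scanning modes.
    all_detected = (all(on_demand_wild.values())
                    and all(on_access_wild.values()))
    # No clean file may be flagged in EITHER mode.
    no_false_positives = (not any(on_demand_clean.values())
                          and not any(on_access_clean.values()))
    return all_detected and no_false_positives


# A single miss, or a single false positive, in either mode fails the product:
wild = {"sample1.exe": True, "sample2.exe": True}
clean = {"notepad.exe": False}
print(qualifies_for_vb100(wild, wild, clean, clean))  # True
print(qualifies_for_vb100(wild, {"sample1.exe": False, "sample2.exe": True},
                          clean, clean))  # False
```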
Other tests
Scanning results of several other collections of malware (zoo collections) will be included in comparative testing reports, including analysis of these results in the form of tables, graphs etc. These collections are subject to regular enlargement, updating and reorganisation without advance notice. Detection rates for these collections will be measured in the same manner as for the WildList test set. Contents of all infected collections will be listed on the VB website after test results are published.
Measurements of scanning speed and on-access overhead will also be taken and published as part of the review. Methods used to gather, analyse and present speeds are subject to modification dependent on the requirements of individual platforms and products.
Default settings
A product's default installation settings will be used for all tests, with the following exceptions:
* Adjustments may be made to logging settings to allow adequate information to be gathered to analyse test results.
* On-access scanning will be disabled, where possible, during on-demand testing.
* In the event of a sample appearing to be missed due to a file type not being scanned by default (a common occurrence with, for example, archives not being scanned on access by some products), such samples may be rescanned with altered settings to verify this, in order to inform VB readers of the cause of such misses. Some adjustments to the file types scanned may be made during speed testing, in order to present more informative comparative data. Any false positive raised as a result of such alterations to default settings will not count against a product for certification, but may be recorded in the review text.
Three chances
Should the reviewer be unable to make a product function adequately, either wholly or in part, or should any event occur which appears to be the result of a problem with the installation and operation of a product, tests may be repeated up to a maximum of three times, on two different test machines using clean images for each attempt.
Review text
Each product submitted for testing will be described to some extent in the text of the comparative review published in Virus Bulletin magazine and on www.virusbtn.com, with attention paid to design, usability, features and other criteria considered by the reviewer to be of interest. For the purposes of this analysis, product settings may be adjusted and additional testing carried out at the discretion of the reviewer. Any comments thus made are the opinion of the reviewer at the time of the review.
Right to reply
Should any vendor have any queries concerning the results of the tests, they are encouraged to contact VB for clarification and further analysis where necessary (email [email protected]).
VB100 award
A VB100 award means that a product has passed our tests, no more and no less. The failure to attain a VB100 award is not a declaration that a product cannot provide adequate protection in the real world if administered by a professional. VB urges any potential customer, when looking at the VB100 record of any software, not simply to consider passes and fails, but to read the small print in the reviews.