PDA

View Full Version : Virus Bulletin June 2008


epa
09-06-2008, 16:14
I don't know whether this news was already known, but just in case I wanted to post it.
According to the latest Virus Bulletin test (to which everyone will assign whatever importance they see fit), these are the results:

ALWIL_________ FAILED
AVG__________ PASSED
AVIRA ________ PASSED
BitDefender____ PASSED
Doctor Web____ FAILED
Eset__________ PASSED
FRISK_________ FAILED
F-Secure______ FAILED
Kaspersky_____ FAILED
MicroWorld_____ FAILED
Norman________ PASSED
Quick Heal______ FAILED
Sophos________ PASSED
Symantec______ PASSED
VirusBuster_____ PASSED

xcdegasp
09-06-2008, 16:36
Thanks a lot for the info :)

leolas
09-06-2008, 19:01
F-Secure FAILED
Kaspersky FAILED


:eek: :eek: I wasn't expecting that, though

ShoShen
09-06-2008, 19:06
:eek: :eek: I wasn't expecting that, though

Me neither :fagiano:

Gianky....! :D :)
09-06-2008, 19:09
What?? What??
Kaspersky and F-Secure failed!?
I can't believe it.
What was this test based on?

epa
09-06-2008, 20:25
The VB100 award was first introduced in 1998. In order to display the VB100 logo, an anti-virus product must have demonstrated in our tests that:

* It detects all In the Wild viruses during both on-demand and on-access scanning.
* It generates no false positives when scanning a set of clean files.


And for anyone who feels like reading it:
VB100 (virus) test procedures

The VB100 (virus) award is granted to any product that passes the test criteria under test conditions in the VB lab as part of the formal VB comparative review process.

The basic requirements are that a product detects, both on demand and on access, in its default settings, all malware known to be 'In the Wild' at the time of the review, and generates no false positives when scanning a set of clean files.

Various other tests are also carried out as part of the comparative review process, including speed measurements, and results and conclusions are included with the review. The results of these secondary tests do not affect a product's qualification for VB100 certification.
Product submissions

Products must be submitted for review by their developers. Deadlines for product submission, along with details of platforms and the WildLists from which the test sets will be compiled, will be announced in advance of the test.

Submissions must be received by VB by the deadline set, with all required components including updates to virus definitions and detection engines. All software must be made available ready to install, update and run in an 'offline' situation, as all testing is performed in a completely sealed environment without access to any external network.

Submissions are accepted in the form of web downloads (the preferred format), email attachments or hard copies sent by post or courier, as long as they arrive before the deadline.

Testing and certification is open to any product vendor, entirely free of charge. By submitting a product for testing vendors agree to have their product reviewed and analysed as VB sees fit, within the scope of the testing methodology presented below. Once a product has been accepted for testing, it may not be withdrawn from the review by the vendor. However, VB reserves the right to refuse to test any product without further explanation.
Award criteria

The requirements for VB100 (virus) certification are:

* 100% detection of malware listed as 'In the Wild' by the WildList Organization.

The WildList to be used for each test will be the latest available at the time of the test deadline. This deadline will be communicated to potential participants, and publicised on the VB website approximately two weeks prior to the submission deadline for each test. All samples in the WildList collection are verified and replicated from originals provided by the WildList Organization.

'Detection' in this case is accepted if the product clearly marks a file as infected in its log or on-screen display, or denies access to it during on-access testing. If such logging or blocking is unavailable or deemed unusable for test purposes, deletion or attempted disinfection of samples will also be an accepted indicator of detection.

* No false positives when scanning VB's collection of known-clean files.

The collection of known-clean files includes the test sets used for speed measurements, and is subject to regular and unannounced updating and enlargement. A false positive will be counted if a product is considered to have flagged a file clearly as infected in its log or on-screen display.

A false positive will not be recorded if a file is labelled as something other than malware, such as adware, or a legitimate item of software with potentially dangerous uses. All other alerts on clean files will be counted as false positives.

Flags will be adjudged to mark either a detection, in which case any files marked thus will be counted as detections in the infected set or as false positives in the clean sets, or mere suspicion, in which case no detection or false positive will be recorded. There will be no overlap between the two.

All tests will be performed both on demand and on access. Any failure to detect a sample from the WildList set, in either mode, or a false positive alert in either mode, will result in a product failing to qualify for the VB100 award.
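The certification rule above is an all-or-nothing conjunction: perfect WildList detection in both modes and zero false positives in both modes. A minimal sketch of that decision logic (hypothetical, not VB's actual tooling; the record type and field names are my own assumptions):

```python
from dataclasses import dataclass

@dataclass
class TestRun:
    """Hypothetical summary of one product's comparative-review run."""
    wildlist_missed_on_demand: int   # WildList samples not detected on demand
    wildlist_missed_on_access: int   # WildList samples not detected on access
    false_positives_on_demand: int   # clean files flagged as infected on demand
    false_positives_on_access: int   # clean files flagged as infected on access

def earns_vb100(run: TestRun) -> bool:
    """A product qualifies only with 100% WildList detection in BOTH
    modes AND zero false positives in BOTH modes."""
    return (run.wildlist_missed_on_demand == 0
            and run.wildlist_missed_on_access == 0
            and run.false_positives_on_demand == 0
            and run.false_positives_on_access == 0)

# A single false positive fails a product even with perfect detection --
# which is exactly how a product with 100% WildList detection can miss
# the award in this comparative.
print(earns_vb100(TestRun(0, 0, 0, 0)))  # True
print(earns_vb100(TestRun(0, 0, 1, 0)))  # False
```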

Other tests

Scanning results of several other collections of malware (zoo collections) will be included in comparative testing reports, including analysis of these results in the form of tables, graphs etc. These collections are subject to regular enlargement, updating and reorganisation without advance notice. Detection rates for these collections will be measured in the same manner as for the WildList test set. Contents of all infected collections will be listed on the VB website after test results are published.

Measurements of scanning speed and on-access overhead will also be taken and published as part of the review. Methods used to gather, analyse and present speeds are subject to modification dependent on the requirements of individual platforms and products.
Default settings

A product's default installation settings will be used for all tests, with the following exceptions:

* Adjustments may be made to logging settings to allow adequate information to be gathered to analyse test results.
* On-access scanning will be disabled, where possible, during on-demand testing.
* In the event of a sample appearing to be missed due to a file type not being scanned by default (a common occurrence with, for example, archives not being scanned on access by some products), such samples may be rescanned with altered settings to verify this, in order to inform VB readers of the cause of such misses. Some adjustments to the file types scanned may be made during speed testing, in order to present more informative comparative data. Any false positive raised as a result of such alterations to default settings will not count against a product for certification, but may be recorded in the review text.

Three chances

Should the reviewer be unable to make a product function adequately, either wholly or in part, or should any event occur which appears to be the result of a problem with the installation and operation of a product, tests may be repeated up to a maximum of three times, on two different test machines using clean images for each attempt.
Review text

Each product submitted for testing will be described to some extent in the text of the comparative review published in Virus Bulletin magazine and on www.virusbtn.com, with attention paid to design, usability, features and other criteria considered by the reviewer to be of interest. For the purposes of this analysis, product settings may be adjusted and additional testing carried out at the discretion of the reviewer. Any comments thus made are the opinion of the reviewer at the time of the review.
Right to reply

Should any vendor have any queries concerning the results of the tests, they are encouraged to contact VB for clarification and further analysis where necessary (email [email protected]).
VB100 award

A VB100 award means that a product has passed our tests, no more and no less. The failure to attain a VB100 award is not a declaration that a product cannot provide adequate protection in the real world if administered by a professional. VB urges any potential customer, when looking at the VB100 record of any software, not simply to consider passes and fails, but to read the small print in the reviews. :stordita:

eraser
09-06-2008, 23:02
They failed because of a false positive

leolas
09-06-2008, 23:21
They failed because of a false positive

thanks eraser for the info ;) :)

eraser
10-06-2008, 00:07
thanks eraser for the info ;) :)

You're welcome :)

I'll add that F-Secure failed, as an obvious consequence, because of the false positive in the Kaspersky engine.

juninho85
10-06-2008, 09:01
"but Avira is full of false positives..." :blah:
"but Kaspersky is the best and never gives even one..." :blah:

Besides the comments, is there anything detailed to be found about the test?

epa
10-06-2008, 10:49
I'll look for some more information now; yesterday I was in a bit of a hurry because of the match.... :incazzed: :tapiro: !!

Gianky....! :D :)
10-06-2008, 10:58
You're welcome :)

I'll add that F-Secure failed, as an obvious consequence, because of the false positive in the Kaspersky engine.

Got it, thanks a lot.
Still, what a shame to miss out on the award over a false positive :doh:

epa
10-06-2008, 11:10
Got it, thanks a lot.
Still, what a shame to miss out on the award over a false positive :doh:

That's true, but I'll point you back to the requirements for passing this test:
The VB100 award was first introduced in 1998. In order to display the VB100 logo, an anti-virus product must have demonstrated in our tests that:

* It detects all In the Wild viruses during both on-demand and on-access scanning.
* It generates no false positives when scanning a set of clean files.
So besides having to detect all the viruses both on demand and on access, it must not generate any false positives; and I can assure you that until not long ago, AntiVir set to maximum was an absurd nuisance with false positives, but apparently they have fixed its only flaw! :rolleyes: ;)

Gianky....! :D :)
10-06-2008, 11:13
That's true, but I'll point you back to the requirements for passing this test:

So besides having to detect all the viruses both on demand and on access, it must not generate any false positives; and I can assure you that until not long ago, AntiVir set to maximum was an absurd nuisance with false positives, but apparently they have fixed its only flaw! :rolleyes: ;)

I use it too, and indeed I have to admit you're right.
It hasn't bothered me with its heuristics in a long while (which I have set to maximum, obviously) XD

epa
10-06-2008, 11:50
As promised, I'm attaching two documents (I hope they display) where you can find the list of viruses used in this comparative, and an explanation of the term "in the wild" used to specify what kind of malware is involved.

The translation of the first document is roughly this:

The WildList Organization collects monthly virus reports from antivirus experts around the world. The data from the reports is compiled to produce the WildList - a list of those viruses currently spreading among a diverse user population. A virus reported by two or more WildList reporters appears in the top half of the list and is deemed to be 'In the Wild'. Recently, the list has been used by Virus Bulletin and other antivirus product testers as the definitive guide to the viruses found in the real world. An antivirus product is expected to achieve 100% detection against this set of viruses.

WILD LIST (http://rapidshare.com/files/121401431/The_Wildlist.doc.html)
LIST OF VIRUSES USED IN THE TEST (http://rapidshare.com/files/121401543/VB100-The_wildlist.doc.html)

cloutz
10-06-2008, 11:55
I can't manage to view them.... once downloaded, it won't let me open the html page...
it's probably reserved for those registered with Virus Bulletin...

epa
10-06-2008, 11:58
I've saved everything as .doc; there shouldn't be any problems now!
:ave:

cloutz
10-06-2008, 12:23
great :sofico:
thanks

Pat77
10-06-2008, 14:11
Kaspersky's rough patch continues; av-comparatives also knocked it around a bit.

_MaRcO_
10-06-2008, 14:29
thanks for the info epa

Bye

epa
10-06-2008, 14:44
:yeah: My pleasure!!