Quarterly Threat Trends March 2018


Userlevel 7
Badge +48


 

In This Issue: The Threat Landscape: Past and Present

 
This quarter, we highlight the newest edition of our annual Threat Report. Download your free copy for a deep dive into the most important cyberattack trends of the past year, and what they indicate for the future. You can also view our Threat Report infographic to get the highlights from the report, or watch our BrightCloud Threat Intelligence video to see how our innovative threat platform uses machine learning to identify and stop emerging threats in real time. Additionally, we encourage you to view the 2018 Cyberthreat Defense Report from the CyberEdge Group to gain deeper insight into how enterprises view and handle modern threats.
 
See the full report!
 

19 replies

Userlevel 7
Badge +34
Thanks Drew - makes very interesting reading.
Userlevel 7
Badge +48
You bet! Lots of great information in there. 
 
Couldn't believe the polymorphic stat! 
 
 
Userlevel 7
Badge +34
Indeed, and it proves the total ineffectiveness of signature-based security solutions.
"94% of malicious .exes were POLYMORPHIC
In 2017, 94% of the malware we encountered was seen on just one machine in the world."
 
Wow! Where does that polymorphic stat leave AV Testing Organisations' malware testing methods??
Userlevel 7
Badge +34
@ wrote:
"94% of malicious .exes were POLYMORPHIC
In 2017, 94% of the malware we encountered was seen on just one machine in the world."
 
Wow! Where does that polymorphic stat leave AV Testing Organisations' malware testing methods??
Outdated.
Quite.
Userlevel 5
Badge +9
@ wrote:
"94% of malicious .exes were POLYMORPHIC
In 2017, 94% of the malware we encountered was seen on just one machine in the world."
 
Wow! Where does that polymorphic stat leave AV Testing Organisations' malware testing methods??
Awww @Muddy7 you just had to ask that, didn't you? I'm going to talk about that in another testing blog post. It's already in the draft :-)
OK, the short answer is that the composition of the samples used in a test must take all of these one-offs into account, and that the analysis is critical to explaining what the results mean. It also means that a darned good ability to detect polymorphic malware is essential.
The whole point of my coming series of testing blogs is not to bash testers; it is to explain why it is so hard to do testing, and why a vendor might look bad if proper test construction and highly insightful analysis are not done.
Yep, it can be compensated for to a large degree, but it does mean that whole-product, end-to-end testing is essential, and that without quality analysis the results do not mean much. The analysis is much harder than the testing. Exactly correct analysis is actually impossible; there are too many variables.
 
I don't lose any sleep over it though. Even I'm not geek enough to go to bed thinking about antimalware testing 🙂
Thanks for your thoughts, @. I look forward to your blog post.
 
Btw if it's not an indiscreet question, where do you think the AV Testers currently are with regard to the benchmark you set in the aforesaid post?
Userlevel 5
Badge +9
@ wrote:
Thanks for your thoughts, @. I look forward to your blog post.
 
Btw if it's not an indiscreet question, where do you think the AV Testers currently are with regard to the benchmark you set in the aforesaid post?
@ it's not an indiscreet question. The answer is that I don't know how to establish a quantitative benchmark. It's a complex moving target, and I don't have access to the testers' internal designs or to their progress at being able to test WSA and other advanced endpoint protection products.
Userlevel 7
Badge +34
Regarding the polymorphic malware, the article merely states that these were "seen" by Webroot. It does not say how many of them compromised end users and how many were prevented from doing so. It would be interesting to know.
Userlevel 7
Badge +34
As a longstanding user of WSA I would really like to know how WSA performed against these zero-day threats. Just ignoring the question leaves one to fear the worst.
Userlevel 5
Badge +9
@, regardless of how long-standing, you deserve a reply! We are glad you are a long-standing user though.
 
Frankly, I don't think the answer is quite as satisfying as either of us would like, because we are talking about measuring the unknown. You probably know this, but to be clear for everyone: 94% is a ratio of prevalence. If we detect 200,000,000 samples and only 12,000,000 are seen frequently, then 94% of what we see are one-offs. I know from discussions with friends who work at other companies that this figure is in line with the industry, but it doesn't quantify detection. Virtually no company has seen the 188,000,000 of the 200,000,000-sample set that we have, nor have we seen theirs, because these polymorphic samples frequently hit just one to ten users in the world.
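To make that prevalence arithmetic concrete, here is a quick sketch using the figures from the post (200,000,000 samples detected, 12,000,000 seen frequently):

```python
# Prevalence ratio: what fraction of observed samples are one-offs?
total_samples = 200_000_000   # all samples detected (figure from the post)
frequent = 12_000_000         # samples seen frequently, i.e. on many machines
one_offs = total_samples - frequent

one_off_pct = 100 * one_offs / total_samples
print(f"{one_off_pct:.0f}% of observed samples are one-offs")  # prints "94%..."
```

Note that this is a ratio of what was observed, not a detection rate, which is exactly the distinction being drawn in the thread.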
 
It gets harder though. Those polymorphic samples are typically members of a variety of malware families. Let's assume there are ten families of polymorphic malware. If a company's heuristic detection (machine learning is a heuristic) is great for the five most prevalent families, then you're probably better protected than with a product that has great heuristics only for the five least prevalent families.
 
Beyond static detection there is also the protection that is offered by blocking behaviors once the malware is executed. There is also journaling and rollback that contribute to actual protection. Those are some of the aspects of protection that we, and several other vendors, are trying to get worked into new testing methodologies.
 
I would love to be able to give you hard numbers, but we don't see a large number of these actual zero-days, because only one to ten of another vendor's customers got exposed to them. It doesn't mean we wouldn't have protected against them. I've worked in or with the industry since 1997. The inability of any of the vendors to empirically quantify this stuff drives me nuts.
 
I should add that there is one metric we can accurately quantify. Protection, system impact, and customer service are the major causes of churn (switching vendors) in the industry. With retention rates being roughly 95%, it can be deduced that almost all of our customers are happy with our performance in these three metrics. 
 
Thanks,
 
Randy
Thanks for that very interesting and informative answer, @.
 
And the answer to @'s question (if we take into account only those polymorphic samples that Webroot has seen on its users' devices)? Put another way, how many have escaped Webroot's heuristic, identity and journalling (plus whatever else I've missed in that description!) protections and managed to do their nefarious worstest?
Userlevel 7
Badge +34
Thank you Randy for your response. I can't say I FULLY understand but I get your drift. However, going back to the original post, it seems to me that the casual reader of the article would assume that WSA actually detected most, if not all, these polymorphic malwares, perhaps even at the initial stage via heuristics or shields, whereas the truth appears to be rather different.
 
Regarding your last paragraph, I would add that I came to Webroot initially because I was particularly impressed with its light system impact and the reviews I read gave good reports on protection and customer support.  So I am not surprised to read of your excellent retention level.
Userlevel 5
Badge +9
@ wrote:
Thanks for that very interesting and informative answer, @.
 
And the answer to @'s question (if we take into account only those polymorphic samples that Webroot has seen on its users' devices)? Put another way, how many have escaped Webroot's heuristic, identity and journalling (plus whatever else I've missed in that description!) protections and managed to do their nefarious worstest?
Hi @ I wish I could answer your question, but I don't know. Pretty much the answer any representative of any security company should give is: too many. Cliché? Perhaps, but if misses are acceptable, then the effort toward improving is insufficient.
 
I'm not sure that I can get a count and I know I can't put it in relative performance context of relative protection. I have never seen any company publish that information. Without that context there are no relative performance metrics. 
Userlevel 5
Badge +9
@ wrote:
Thank you Randy for your response. I can't say I FULLY understand but I get your drift. However, going back to the original post, it seems to me that the casual reader of the article would assume that WSA actually detected most, if not all, these polymorphic malwares, perhaps even at the initial stage via heuristics or shields, whereas the truth appears to be rather different.
 
Regarding your last paragraph, I would add that I came to Webroot initially because I was particularly impressed with its light system impact and the reviews I read gave good reports on protection and customer support.  So I am not surprised to read of your excellent retention level.
Hi @ The Quarterly Threat Trends report is not about how many we catch; it is about the trend we see toward an increasing number of samples that are one-offs. It isn't saying 94% detection of all polymorphic threats, but rather that 94% of the threats we see are polymorphic. There was a day when the threats seen were perhaps 94% static and 6% polymorphic. The trend is toward polymorphism, so technologies addressing the changing threatscape have to focus on that. If you don't know your enemy's tactics, you can't effectively adjust your own.
 
@ wrote:Hi @ I wish I could answer your question, but I don't know ...
(and)
... I'm not sure that I can get a count and I know I can't put it in relative performance context of relative protection ...
Pity you don't—and can't!
 
What I DO know from my personal experience is this: to date, I've never knowingly been infected using Webroot SecureAnywhere, which I've used continuously since 2011. Before using Webroot, I used Prevx, which was acquired by Webroot and on which they completely rebuilt their AV technology—and the same was true for me with Prevx as with Webroot.
 
On the other hand, before using Webroot or Prevx (pre-2007), I used several different reputable AV names, and yet I (my company computers and then I, when I went freelance) seemed to regularly get infected every 3 to 6 months or so.
 
Incidentally, I don't quite understand what you mean when you say "put it in relative performance context of relative protection" and "relative performance metrics". By "relative", do you mean relative to other AV products? If so, that makes sense. Though it's still a pity you don't have the statistics on Webroot's failure rate (or success rate—it comes down to the same thing ;)) with polymorphics.
Userlevel 5
Badge +9
@ wrote:
@ wrote:Hi @ I wish I could answer your question, but I don't know ...
(and)
... I'm not sure that I can get a count and I know I can't put it in relative performance context of relative protection ...
Pity you don't—and can't!
 
What I DO know from my personal experience is this: to date, I've never knowingly been infected using Webroot SecureAnywhere, which I've used continuously since 2011. Before using Webroot, I used Prevx, which was acquired by Webroot and on which they completely rebuilt their AV technology—and the same was true for me with Prevx as with Webroot.
 
On the other hand, before using Webroot or Prevx (pre-2007), I used several different reputable AV names, and yet I (my company computers and then I, when I went freelance) seemed to regularly get infected every 3 to 6 months or so.
 
Incidentally, I don't quite understand what you mean when you say "put it in relative performance context of relative protection" and "relative performance metrics". By "relative", do you mean relative to other AV products? If so, that makes sense. Though it's still a pity you don't have the statistics on Webroot's failure rate (or success rate—it comes down to the same thing ;)) with polymorphics.
@ Yes, in relation to other products. Good and bad are relative values.
Userlevel 3
Thanks for the info. Kudos
