Solved

Exclude folders from scanning?




@ wrote:
I know. Which means the person who provided the files never actually scanned them with Webroot.
I had WebRoot running all the time - but I was not aware that folders on my Virtual Machine (which are treated as network folders) have to be added manually to the list of folders to be scanned... I added them now and it found one virus (I did that yesterday or so, I think). I cannot fully recreate the situation since I had other antivirus programs running in between, which already removed some of the suspicious code. Meanwhile, I have uninstalled all the other antivirus programs and am left with WebRoot alone.
 
All the other antivirus programs did not require network folders to be specified - they scanned all folders accessible to the computer. Maybe this is a suggestion for WebRoot to change: make a scan of all accessible folders the default.
 
Secondly, I had some archives on my computer which I had never extracted on my machine... I only became aware of these suspicious files actually being on my computer after running an antivirus program on the MacOS side of my computer.
 
To sum up, am I right to put it like this?
 
  1. WebRoot does not automatically scan all network folders. They have to be added manually.
  2. WebRoot does not scan archives (in comparison to other Antivirus Programs)
  3. Point 1. & 2. explain why some of the actual threats were not detected on my Machine, while WebRoot actually would identify them as threats.
  4. The remaining suspicious files which were identified by other programs and not by WebRoot in fact are not actual threats.
 
Is this correct?
Userlevel 7
Scanning everything or a substantially larger group of things is not going to happen, I can tell you that straight away.
 
Millions of people use Webroot specifically because it does not scan everything. It's smart enough to know what to scan, unlike other AV programs that aren't intelligent enough to know what's possibly a threat to the computer they're running on.
 
A file in an archive can do nothing until you extract a copy from the archive. Why waste time scanning it inside the archive if it can't do anything in there? If you pull it out, then it'll be scanned.
 
I do know that at least some of the files other AVs claimed to be threats definitely are not threats. I cannot speak for all of them as I have not seen all of them.
 
So:
1: Webroot does not automatically scan the vast majority of files on a computer and does not scan attached network shares. There is no reason to scan everything on the network shares to protect your computer and doing so on some networks would be highly detrimental to the networks. At work with the old AV, I had to tell it to -ignore- all network shares because the workstations all scanned at the same time and the server slowed to an utter crawl every day at that time because of it.
2: Webroot -does- scan Archives in specific situations which honestly I cannot remember at the moment. Many archives will not be scanned since their contents are inert until extracted and thus scanning is detrimental to system performance for no benefit to the security of the system.
3: Point one at the very least, definitely yes.
4: At least some of them are not threats, correct. I can only speak for the ones I saw and examined directly, not all the remaining ones, most of which I  was not given to examine directly.
@ wrote:
Kit, or anybody else,
 
can someone explain to me again, in simple terms, what the main difference is between WebRoot and other antivirus applications?
 
Why does WebRoot use substantially fewer resources than the others?
 
 
Thank you
Thank you for the reply, Kit. So this means that WebRoot only catches or kills a threat when it actually could become a risk. Or in other words, as long as it is impossible for it to become a threat, WebRoot does not take action (for example a virus in an archive, a virus in a network folder, etc.).
 
Is that the main reason why WebRoot uses substantially fewer resources than other antivirus apps? (See my quoted question above.)
 
Thanx
This is a very true and excellent point, dear Kit.

Since it's valid for over 99% of cases (only 1% or far fewer have a virtual machine on their computer), I fully agree with letting WebRoot ignore network locations BY DEFAULT.

Thanx for that excellent point.

------------- Kit wrote:
1: Webroot does not automatically scan the vast majority of files on a computer and does not scan attached network shares. There is no reason to scan everything on the network shares to protect your computer and doing so on some networks would be highly detrimental to the networks. At work with the old AV, I had to tell it to -ignore- all network shares because the workstations all scanned at the same time and the server slowed to an utter crawl every day at that time because of it.
Userlevel 7
In cases of VMs, technically Webroot should be run on the VM.  Obviously in the case of a Mac VM, this ends up being a different process. No AV is perfect. Even the user will create security holes. No AV can stop somebody from doing something unexpected or dumb. ;)
 
So, Webroot is Extra-Light partially because it is smart enough to only scan what can actually be bad, yes. There are other things that contribute to being light though. 
 
Certain generic data is collected about every file that is scanned, by every AV. With other AVs, this data is then compared against the local database of "Bad Stuff". That database can get pretty big (hundreds of megabytes). It has to remain in memory or it will chew up the disk all the time instead. And each record it goes through is multiplied by each extra file it scans. 5,000,000 records times 300,000 files means a heck of a lot of comparisons. Even one file against 5,000,000 records is a mess of work comparatively. And while 5,000,000 may seem like a lot, it's a fraction of the bad stuff out there.
 
Webroot gets the data, then sends it to the servers on the internet. They have a heck of a lot more power than any one computer and they have access to somewhere like 500,000,000,000 pieces of information instead.  (Number pulled from under a carpet.  Maybe somebody with current data can chime in with how many rows are in the Enzo DB now).  So instead of your computer getting 30 pieces of generic data about a file and comparing it to 5 million things every time you access a file, it gets those 30 pieces of generic data and lets the server compare them to 500 billion things and send back the response.
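 
To make the contrast a bit more concrete, here's a rough Python sketch of the two lookup models. It's purely illustrative, not Webroot's actual code: the metadata fields, the known-bad set, and the lookup_service callback are invented stand-ins for a signature database and a cloud backend.
 
```python
import hashlib
import os

def file_metadata(path):
    """Collect a few generic data points about a file (hash and size)."""
    with open(path, "rb") as f:
        data = f.read()
    return {"sha256": hashlib.sha256(data).hexdigest(),
            "size": os.path.getsize(path)}

def local_scan(path, local_signatures):
    """Traditional model: compare against a signature database held on the client.

    local_signatures is a set of known-bad hashes that has to live in memory;
    with millions of entries, memory use and per-file lookup cost add up.
    """
    meta = file_metadata(path)
    return "bad" if meta["sha256"] in local_signatures else "unknown"

def cloud_scan(path, lookup_service):
    """Cloud model: send the small metadata record and let a backend decide.

    lookup_service stands in for the network call; the heavy comparison
    against a far larger corpus happens server-side.
    """
    meta = file_metadata(path)
    return lookup_service(meta)

if __name__ == "__main__":
    # Placeholder "known bad" hash (this is just the SHA-256 of an empty file).
    known_bad = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
    print(local_scan(__file__, known_bad))
    print(cloud_scan(__file__, lambda meta: "good"))  # stubbed-out backend
```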
 
But it's even more than that too. Other AVs look at the file and get 30 pieces of data. One of those is the hash signature of the whole file, which indicates whether the file has changed at all. So the whole file needs to be read to get that data. And every time the file is accessed by something, the AV gets those data points again and compares them against 5 million things each time.
 
Webroot reads the whole file to get those 30 data points too. But it always gets the full hash first. If it sees that the hash is the same as what it got before, it knows the rest of the data is the same, so it doesn't waste time collecting that data again. That's a good thing.
 
"But what about having to read the whole file to get the hash? That takes a long time!"
Yes, yes it does. But... When Webroot is running, it knows everything that touches the disk to write. So if the file has not been touched by anything, Webroot doesn't even need to get that hash.
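 
Roughly, that caching idea looks like the sketch below. Again, this is a toy illustration and not the real engine: the on_file_written hook stands in for a filesystem write monitor, and the classify callback stands in for the cloud lookup.
 
```python
import hashlib

class VerdictCache:
    """Cache file hash + verdict; only re-read a file when something has written to it."""

    def __init__(self, classify):
        self.classify = classify      # e.g. a cloud lookup: hash -> "good"/"bad"
        self._cache = {}              # path -> (sha256, verdict)
        self._dirty = set()           # paths written to since the last check

    def on_file_written(self, path):
        # Called by a (hypothetical) filesystem monitor whenever anything
        # writes to the file, so we know the cached hash may be stale.
        self._dirty.add(path)

    def check(self, path):
        if path in self._cache and path not in self._dirty:
            # Nothing has touched the file since last time: reuse the stored
            # verdict without even re-reading the file to hash it.
            return self._cache[path][1]

        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()

        if path in self._cache and self._cache[path][0] == digest:
            verdict = self._cache[path][1]   # contents unchanged after all
        else:
            verdict = self.classify(digest)  # only now do the expensive lookup

        self._cache[path] = (digest, verdict)
        self._dirty.discard(path)
        return verdict
```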
 
And even more importantly: other AVs are statically prophylactic only. They -must- catch the threat before it starts doing anything at all in machine code, otherwise they cannot prevent the infection. That means that before something can run or be accessed, it has to be fully checked. Thus the access is put on pause while it is checked.
 
By comparison, Webroot has a much larger buffer for what it can do to protect you. Even if a threat has started running, everything it does is watched, recorded, and thus revertible. Critical access to sensitive places is blocked until things are determined to be safe, but most programs never try to access those sensitive places. So from the program's point of view, it isn't stopped or inhibited at all. If the thing turns out to be a threat, it's killed and everything it did is undone. Even days later, but usually within seconds or minutes.
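 
As a toy illustration of that watch-record-revert idea (nothing like the real mechanism, and it only handles file writes), here's a minimal journal that snapshots files before a monitored program touches them and can roll everything back:
 
```python
import os
import shutil

class ActionJournal:
    """Record file changes made by a monitored program so they can be undone."""

    def __init__(self, backup_dir):
        self.backup_dir = backup_dir
        self.entries = []                      # (path, backup copy or None)
        os.makedirs(backup_dir, exist_ok=True)

    def before_write(self, path):
        # Snapshot the file just before the monitored process modifies it.
        # If the file doesn't exist yet, remember that it is newly created.
        backup = None
        if os.path.exists(path):
            backup = os.path.join(self.backup_dir, f"backup_{len(self.entries)}")
            shutil.copy2(path, backup)
        self.entries.append((path, backup))

    def rollback(self):
        # The process was judged malicious: undo its changes, newest first.
        for path, backup in reversed(self.entries):
            if backup is None:
                if os.path.exists(path):
                    os.remove(path)            # it created the file; delete it
            else:
                shutil.copy2(backup, path)     # restore the pre-change copy
        self.entries.clear()
```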
 
So:
 - Scan only the important things that need to be scanned to protect you fully.
 - Remember information for quick access so things don't need to be repeated over and over again.
 - Don't do all the work of checking things on your computer. Your computer has better things to do with its CPU and disk. Check the things in the cloud.
 - Don't put things on hold unless they want critical deep access that could allow evading Webroot. Most programs don't go that deep, so they don't get put on hold.
 - Have a developer that prefers to write ten lines of highly-efficient code to make something happen instead of 200 lines of junk that works, but is icky. (Heck, even I did stuff like that. Our old WLogs program was about 5MB or so and had a whole bunch of weird ways to try to do things, and did them slowly, about twenty minutes or more to run. When I wrote WSALogs, I focused strongly on doing it all more efficiently and quickly and doing it all in a smaller package. Under a megabyte and usually runs in three minutes or less.)
 
WSA is a lot of looking at the things that make other AV programs heavy and finding solutions that make WSA light.
Kit,

Thanx for the exhaustive explanation. Does that mean that WebRoot needs to have an Internet connection in order to identify viruses or malicious code?

Thanx.
Userlevel 7
@ wrote:
Does that mean that WebRoot needs to have an Internet connection in order to identify viruses or malicious code?
No, but...
 
Every AV needs the internet to identify threats accurately. Even if you download the newest version of the other stuff available to install, it will still be quite some time behind on its patterns and definitions. The downside is that if the other stuff misses something because a network connection was missing and so it couldn't get its info, it's pretty much game over. The threat has free rein and can do anything it wants without the other AV knowing what happened, because the other AV only knows "it's not on my list of bad things, so it must be good". All AVs require an internet connection to work well, but they will at least work somewhat without it.
 
Webroot also needs an internet connection to identify threats most accurately. When it knows it has none, it goes into effectively Hyper-Paranoid mode and watches things much more closely. It can still detect and clean up threats in this situation, but not nearly as effectively. Which is why the requirements for the software (just like other AVs) include "Internet Connection". Webroot requires an internet connection to work well (and won't even install without one), but will not only work some without it, but also can completely 100% recover the system from any infection that gets in while the network is down.
 
However, that's a tricky thing. The vast, vast majority of threats come from the internet, so when the network is dead and the AV (any AV) is less effective, the likelihood of catching a threat is also reduced hugely by the air gap. The air gap itself is a major way to avoid infections.
 
Yes, something could be downloaded before, but Webroot does scan downloads realtime. Downloaded and moved to a USB stick and then introduced to the machine when the network is off? The chances of that being done by any of the millions of Webroot users is so exceptionally slim it's not even funny. Never been done by accident. Only proposed in theoretical cases or instigated intentionally for testing purposes.
 
So:
No, Webroot does not need the internet to identify threats, BUT: It does need them to identify well and accurately, BUT: So does every other AV in the long run, AND: Webroot without the internet does better than any other AV with no definitions or old definitions, AND: Items that are undetected without the network are still fully removed when the network returns and any damage that could have been done is undone, AND: The chance of getting a new infection without internet is next to none and has never been seen in legitimate user cases.
 
Next on my agenda, I think I will do some cross-checks against claudiu and encourage Jim to do the same.  Then... DINNER!
This is the feedback of SYMANTEC NORTON on submitting the suspicious samples:
 
From: Symantec FP Incident Response <falsepositives@symantec.com>
Subject: [No Reply] False Positive Submission [3250995]
Date: July 15, 2013 11:26:46 GMT+02:00
To: sndbbbl
In relation to submission [3250995].

Upon further analysis and investigation we have determined that the file(s) in question meet the necessary criteria to be detected by our products and as such, the detection cannot be revoked.
 

They wrote some more text, which was irrelevant to the result. No further details were provided by them.
Userlevel 7
As there is no information on what files they are saying this to, nor on what their criteria for detection are, there is nothing that can be said regarding it other than: There is nothing that can be said regarding it.
 
Transparency is important. For example, as Roy said, we don't mark things that are legitimate software in many cases, such as toolbars. So without knowing specifically what the file in question is and what specific Norton criteria it meets, the information is not useful to anybody to make an educated decision on anything.
@ wrote:
As there is no information on what files they are saying this to, nor on what their criteria for detection are, there is nothing that can be said regarding it other than: There is nothing that can be said regarding it.
 
Transparency is important. For example, as Roy said, we don't mark things that are legitimate software in many cases, such as toolbars. So without knowing specifically what the file in question is and what specific Norton criteria it meets, the information is not useful to anybody to make an educated decision on anything.
Kit,
 
yes. And I think that more can be said:
 
Their comment "meet the necessary criteria to be detected by our products"... is almost as much as saying:
 
  • Our products have detected the suspicions
  • We never doubt our products
  • So believe it that we are correct
which as you say is not a really substantial e-mail and no reply with content.
 
Thank you all for your input.
 
Userlevel 7
I wouldn't necessarily say that, per se. It's not a matter of trusting the program, but rather of them saying they trust their rules and criteria completely. However, without knowing what the criteria are, it's difficult to say whether something is actually a threat.
 
For example, "If it modifies the machine code of another binary and isn't signed by the same trusted certificate as the file being modified, it's a threat" will potentially catch a lot of legitimate patches and third-party patches. If their criteria include "Make changes to an installed program in some manner that allows evasion of DRM", then something as simple as a program that puts a text string in the registry to allow something to extend its trial would be considered an infection by their AV, even though it really isn't.
 
Your observation is more along the lines of: "This file was detected by a definition that says it does X, Y, and Z. Even though this specific file doesn't actually do that, the definition says it does, so that's good enough."
 
Webroot tries to be clear on what the criteria are and exactly what a specific file did that triggered the detection. That way other educated people can verify it separately as well and make educated decisions.
 
Userlevel 7
It could also simply be detecting a PUA that Webroot doesn't bother with... something like a toolbar. There's no way to know with the information given.
Userlevel 1
My Norton subscription is running out, so I'm checking back in here to see if Webroot has come to their senses. Alas, still being stubborn it seems, even though a business user submitted a long list of good reasons to allow folder exclusion:
http://community.webroot.com/t5/Ask-the-Experts/Re-Exclude-folders-from-scanning/td-p/7666
 
Looks like Norton can count me in for another year.
Userlevel 7
@ wrote:
My Norton subscription is running out, so I'm checking back in here to see if Webroot has come to their senses. Alas, still being stubborn it seems, even though a business user submitted a long list of good reasons to allow folder exclusion:
http://community.webroot.com/t5/Ask-the-Experts/Re-Exclude-folders-from-scanning/td-p/7666
 
Looks like Norton can count me in for another year.
I have a list of reasons to not include folder exclusions:
A search of client systems at work with a work-order line item of "Threat Cleanup - AV Missed - User Error - Exclusion"
 
But no problem, because hey, Norton doesn't want you to get infected (Norton Virus and Spyware Removal Service, $99), right? So they must know best. Webroot must be crazy to protect you like that. ("Webroot offers free infection remediation with a current subscription.")
 
Edit:
By the way, the post you link to has no good reasons at all. "It removes all these tools!" he said. "No it doesn't," I pointed out. (In fact, it still doesn't. I have dozens of tech utilities - Nirsoft, password recovery, etc. - and Webroot has never once flagged them.) And as I pointed out a year ago, the business product is centrally managed, so excluding the utilities once centrally will exclude them on all endpoints.
 
At this point, I really don't foresee Webroot changing their stance on this and reducing the protection of millions of people just to satisfy a few for no actually-good reasons. Give a legitimate case where folder exclusion is the best and only solution for a majority of customers and it might be revised.  Until then, enjoy Norton's computer resource usage and $99 virus removal fees if they miss something.
Userlevel 1
Do your homework, Kit. Norton's resource usage is barely noticeable these days. And I wouldn't dream of paying them or anybody else $99 to reinstall Windows - which is what virus removal mostly comes down to. I've been doing this for free for about 20 years for my extended family and friends.
 
I know that I cannot change your mind and you know that you cannot change mine. Let's call it a draw 😉
Userlevel 7
An AV's true impact can only be measured by its effect on real-world tasks and loading times. Task Manager does not show true system usage, especially for things like antivirus software.
 
Norton is a pretty good product these days - they really turned it around compared to what it used to be.
I got a mail saying that this TOPIC has been solved... I have not yet received a solution to my problem with running iMonitor. I am still waiting for tech support feedback.

Please don't ask us whether we can get the majority to want this, etc. From my point of view, I have a perfectly legitimate application that prevents me from using WebRoot.

I only got a reply saying: we will look into it, but cannot say what the outcome will be. And I have now been waiting several months without any feedback.

The fact is that on several hundred PCs in our office, we cannot use WebRoot now. Nor on my private one - for the same reason.

Thank you.
Userlevel 7
@ wrote:
I got a mail saying that this TOPIC has been solved... I have not yet received a solution to my problem with running iMonitor. I am still waiting for tech support feedback.

Please don't ask us whether we can get the majority to want this, etc. From my point of view, I have a perfectly legitimate application that prevents me from using WebRoot.

I only got a reply saying: we will look into it, but cannot say what the outcome will be. And I have now been waiting several months without any feedback.

The fact is that on several hundred PCs in our office, we cannot use WebRoot now. Nor on my private one - for the same reason.

Thank you.
A topic in the forum being marked solved just means it received a "correct answer" in the community forums. Keep in mind that sometimes a correct answer is not the answer people are looking for, and often it is not a personalized support answer.
 
In the case of several hundred PCs at the office, manually adding the specific files for the greyware as overrides on the enterprise console should resolve the issue. If all of the files are overridden by the organization as Good and the issue is still not resolved, it would require contacting support to create a ticket that is tracked by and can be addressed by the developers. Detailed examination of logs would be necessary to resolve the issue and see if it is simply a file that was missed or a general interaction that needs to be revised.
 
I'm still somewhat curious what's going on in your case, since I was able to get iMonitor to run and operate just fine on a test machine with Webroot installed.
Dear Kit,

I sent everything, including log files, to Support after opening a ticket. I only got a reply that the issue has not been solved yet, and it is still undecided when, or whether at all, this issue will be addressed any time soon. Some features of iMonitor do not run with WR active. We tracked down which feature it was - no solution as of now.

Making a piece of software boot without error does not mean that it performs all its functions and features as expected. For me, the issue remains unsolved until I get a solution that works.
Userlevel 1
At this point, I really don't foresee Webroot changing their stance on this and reducing the protection of millions of people just to satisfy a few for no actually-good reasons. Give a legitimate case where folder exclusion is the best and only solution for a majority of customers and it might be revised.
It's unfortunate. I will also not be renewing my subscription due to this issue. I am sorry, but Webroot's stance on this just comes off as arrogant, as in: they know better than I do what's good for me.
I don't see how providing an option that novice users would not even be aware of would "reduce the protection of millions of people." I see very clearly how disabling Webroot to stop it from getting in the way would.
But the issue here is not what the best way to prevent annoying false alarms is. A number of users requested this feature. They want to do things this way. They told you so. Coding something like that in sounds very simple. But if Webroot intends to dictate what I can and cannot do on my computer, I will go to a competitor that is more flexible.
Userlevel 7
It's not arrogance, I assure you. You think it's a simple and effective idea, but in reality it's a terrible one. Even as it stands, I see people getting infected by adjusting the settings or allowing infections by altering our determinations. I have seen even people who are quite technical get themselves infected by allowing files they think are good or by ignoring our warnings.
 
Our job is first of all to protect our customers, and sometimes that means we have to take a tough stance on certain issues.
 
Imagine the chaos that could happen if somebody excluded their entire C: drive. It may seem crazy, but I have seen people do it with other products!
Userlevel 1
@ wrote:
I have seen even people who are quite technical get themselves infected by allowing files they think are good or by ignoring our warnings.
 
Our job is first of all to protect our customers, and sometimes that means we have to take a tough stance on certain issues.
 
Imagine the chaos that could happen if somebody excluded their entire C: drive. It may seem crazy, but I have seen people do it with other products!
See, by that rationale, you should also remove the option to disable protection. I am sure plenty of users got infected that way too. You should prevent people from uninstalling Webroot, period, for whatever reason, because that could also easily result in infection.
You can't protect users from themselves. When you try, you provide an easy excuse for lack of functionality, which just angers people.
Thank you, railshot. There are many alternatives. I was with Webroot for my entire family and had planned to use it in our corporate business. But while waiting for a reply, and while always getting the same replies that WebRoot is close to perfect, I looked into alternatives. In some cases I got really amazing support from people who actually listened to our issues, requirements, etc.

Cheers and all the best. I am out of here.
Userlevel 7
It is a very difficult thing to balance: advanced users want and need the flexibility. At the same time, I have seen many, many non-advanced users adjust settings without knowing what they will really do and end up infected as a result.

Some vendors charge the end user for infections that occur even while their computer is protected by the product; some, like Webroot, do not. It costs the vendor a lot of money to provide free disinfection services, so that probably needs to be taken into account as well.

To be honest... I am a fence-sitter on this whole issue. As a user myself, I want more control. At the same time, I also dread it in terms of the problems it can cause.

I think it will be interesting to hear more about this whole discussion.
Why should we try to convince Webroot of the reasonableness of what some of us want, if what we want is already available from other developers? Most others have this feature - there must be a reason. What we think is reasonable has already been clearly rejected by Webroot as not reasonable. It's a clear case.
