Solved

Exclude folders from scanning?


Userlevel 1
Is it possible to exclude certain folders from scanning?
 
Thanks!

Best answer by Kit 7 May 2012, 16:16

There we go. Having the reasoning makes it easier to address your concerns. Keep in mind, you're not talking to a Corporate Stooge/Tier 1 tech here.  You're talking to a real person with 17 years in the security industry who works here because it's the best solution out there.   Yes, I frequent /. and Wilders and various others too.  You'll also find a lot of Wilders users on here.

 

For one, I have huge archives, ebook collections and other stuff that doesn't need to be scanned over and over again.

 

Not a concern.  We don't operate that way anyway, unless you're doing something silly like Full Scans regularly as opposed to the default deep scans.  Remember, we're not the Other Security Software.  We already know the eBooks can't contain threats, and we don't care about the archives.  There's a reason we've been consistently the lightest and fastest since we came out.  Simply put, the agent knows what can run, what is likely to run, and what will run.  It doesn't worry about data files that cannot contain machine code, and in the event of things such as buffer overflow remote code execution, it uses a realtime system at that time rather than trying to determine which files could possibly cause an issue.
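Kit's point that the agent only bothers with files that can actually contain machine code can be illustrated with a tiny sketch. This is a hypothetical illustration only, not Webroot's actual logic: a scanner keyed on the PE "MZ" magic bytes rather than the file extension would pass over eBooks and archives after reading just two bytes:

```python
def looks_executable(path):
    """Heuristic sketch: treat a file as scan-worthy only if it begins
    with the DOS 'MZ' magic that every Windows PE executable carries.
    (Illustrative only -- a real engine checks far more than this.)"""
    try:
        with open(path, "rb") as f:
            return f.read(2) == b"MZ"
    except OSError:
        return False
```

A real engine does vastly more (scripts, archive formats, and so on); the sketch just shows why a folder of high-resolution PSDs or ePubs never triggers a content scan in the first place.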

 

Trust me, my wife used to have to set to exclude her art directories (yay for monolithic high-resolution PSDs), and she doesn't anymore because even she recognizes that the agent doesn't even bother to glance at them.

 

I also have some password recovery programs that are always mistaken for a virus and then sometimes simply deleted (especially by Bitdefender).

 

This is a trickier one.  No matter what else you use, you're stuck in a situation where "Ignore this file proactively" is a need in those cases.  Norton requires you to ignore the file too.  That being said, we -never- "simply delete".  Will we detect things as "Hacking Tools" if they can be used for nefarious purposes?  Yes.  However unless you're uninstalling and reinstalling the agent frequently, this is a non-issue.  A one-time "Put that back and ignore it" solves that issue without leaving the figurative key beneath your doormat. 

 

Unfortunately, despite my own opinions and everybody else's, we really do have to do what's best for "The majority", so your much-derided "Because it's what's good for you" actions are necessary.  While you may be able to recover from a severe infection in minutes, the majority of people can't and it's a catastrophic event.

 

I know how it works and I like to be in charge and take decisions. I don't like software telling me what I can and what I can't do.

 

Been there.  Still there.  Honestly, I get much greater granularity of control from WSA.  Even Norton has limitations.  Try telling it to use under 6MB of RAM while idle. Try telling it to "ignore the low-level utilities in this directory, but please do catch any actual threat that goes in there"...  The former is done with an exclude, but the exclude makes the second half of the latter impossible.

 

It kind of sounds like you've been doing full scans.  Don't.  That's not an order, mind you.  That's a recommendation.  Don't waste your time making the WSA agent pretend to be a slow, old, bloated security client.  It's just not necessary, nor does it really help anything.  Plus it'll cause you to have more desire for exclusions. Under normal operation with deep scans, it works faster, more efficiently, and there is no need for exclusions.

 

Anyway, it's all up to you.  Every software package will have its own upsides and downsides.  On the firewall side, I use the Windows firewall to pre-emptively block software from accessing the internet. Others use the firewall of their choice. The SecureAnywhere firewall is specifically intended to be an anti-threat firewall, not a firewall made for user control decisions.  That's why it can run along with any other firewall. 

 

I answered this because even if you went back to Norton, you probably wouldn't have responded if you weren't looking for an answer.  Oh, and if you truly decide that ditching the software is the way to go, we give a 70-day Money Back guarantee if you got it through an authorized source.

 

That being said, if you have low-level questions, please do feel free to continue to ask.  I do have answers.

 

Edit: Spell-checking again (the original was posted from home with no time to finish completely.  *Oops*).  Anyway, one of the things I would ask you to keep in mind is that while you can see your desires and actions, we have to take into account the needs and actions of millions of users.  I think that when you said "semi-intelligent adult", you grossly underestimated your skill level.  We used to see all the people who were infected because they followed instructions on web pages that explained precisely how to exclude a directory from scanning, followed by saving "this program you want" to that directory and running it.  If ever "This is why we can't have nice things" applies, this is an example.  And that's just one of the ways exclusions caused problems.  So we really are working hard to make all of the classic and other possible reasons for wanting an exclusion function obsolete, so that a blanket exclusion is not necessary.

176 replies

Userlevel 7
Badge +34
@ wrote:
Hey, Dan, how did this get anything to lock? Last time I watched Webroot with ProcMon, it never did any exclusive opens or locks until it found something it explicitly didn't like (Full Positive). Did the code base change that dramatically?
Hi Kit,
 
Without additional information I could only speculate... we'll at least need to see logs, etc.
 
-Dan
Userlevel 7
Badge +34
@ wrote:
As a coder, you should be familiar with stream (including file) open commands. You pick the file, you pick the mode, you get a handle.
 
So why am I getting locked files that are causing compilation to fail? Lockhunter reports WRSA holding the locks.
 


Please Submit a Support Ticket for this case. Shutting down WSA when compiling may be a temporary solution, though obviously not ideal. 
 
-Dan
Userlevel 7
Badge +34
@ wrote:
Here is Google's "strong" advice re. installation of its G Suite syncing software, to permit MS Outlook users to sync their pst files with G Suite email, etc.:
"Dear G Suite Sync user,

Thanks for installing G Suite Sync for Microsoft Outlook®. This software will synchronize your calendar, contacts, email, notes, tasks and domain's global address list with G Suite. Before you get started, there are a few things you should know about the current version of G Suite:
• Your journal entries will not synchronize with G Suite.
• G Suite Sync will initially download up to 1GB of email from the G Suite Server to your desktop. You can change this setting from the system tray menu. (learn more).
• Your initial sync can take a long time, because there's a lot of email to download. To see your synchronization status, look at synchronization status in your system tray.
• We strongly recommend that you create an exclude rule in your antivirus software so that it does not scan any files located under %LOCALAPPDATA%\Google\Google Apps Sync.
For more information, go here:
• G Suite Sync User Manual
• What's different and won't sync between Microsoft Outlook® and G Suite
• Read the Frequently Asked Questions
• Tools for administrators deploying G Suite Sync
• How to get help
Thanks for using G Suite Sync,
The G Suite Sync Team"
These suggestions are for "traditional" AV software. Since Webroot SecureAnywhere functions differently, exclusions like these are only needed in rare cases. 
 
-Dan
Userlevel 7
Be careful about thinking it's "Just Marketing" when there's a tech person involved who understands kernel drivers.
 
You are anything but "uncommon" in the sense of that. What is "common" in that wording is much more meta than simple coding. When I worked at Webroot, coders made up a surprisingly large portion of our user base, based on the presence count of compilers on Webroot-protected systems. (It's literally a number, mind you, so don't go shouting privacy foul. We could say we want to know how many systems a specific compiler is on, and it would give us nothing but a count. We -must- know the hash of the compiler executable in order to look it up as well, so random files cannot be found.)
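The presence count described here can be sketched as follows. The function names and report format are hypothetical, not Webroot's telemetry protocol; the point is simply that if the backend holds only hashes, it can count systems carrying a known executable but cannot recover arbitrary files from the reports:

```python
import hashlib


def file_hash(path):
    """SHA-256 of a file's contents -- the only identifier a client reports."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def presence_count(per_system_reports, known_hash):
    """Backend side (hypothetical): count how many systems reported a
    hash we already know. Hashes we can't match tell us nothing."""
    return sum(1 for report in per_system_reports if known_hash in report)
```

To look something up, you must already know its hash, which is exactly the "random files cannot be found" property described above.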
 
"Sometimes visual studio is unable to write the new program to disk because the older version is being inspected by the anti-virus."
" I just had a pop-up from outlook about an email attachment that was "open by another process".  The attachment was a word document I had written the night before and the other process could in theory be some virus that snuck past webroot, but my suspicion is the webroot was inspecting it. "
 
Great news! It's not Webroot inspecting. Webroot doesn't do blocking I/O. It does not open files for exclusive access when just scanning. Have you checked whether you have Windows Defender running? It does.
 
As a coder, you should be familiar with stream (including file) open commands. You pick the file, you pick the mode, you get a handle.
 
In Visual Studio, unless you're writing multi-hundred-megabyte files, Webroot is generally finished scanning it in less than a fraction of a second. You can easily see its actions in Procmon. That being said, "generally" is an operative term. After all, some people have extra factors that impact I/O performance for example. In any case, Webroot gets a file handle, yes, but it's a non-exclusive, non-blocking file handle.
 
So even if something is driving your IOPS down the drain and Webroot is going to take, say, twenty seconds to scan, you can still modify the file, overwrite the file, or delete the file and Webroot will not get in the way -unless- Webroot has decided that the file matches the specifications of something highly likely to be malicious. Mind you, if it's taking Webroot twenty seconds to scan it, it would also take you that or more to write it just in I/O lag, but that's beside the point.
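The non-exclusive, non-blocking handle behavior described above is easy to demonstrate. The sketch below uses portable Python file opens, which take no exclusive lock (the Win32 analogue is CreateFile with FILE_SHARE_READ | FILE_SHARE_WRITE); it illustrates shared-read semantics generally, not Webroot's code:

```python
def demo_shared_read(path):
    """Hold a read-only handle on `path` (as a scanner would) while a
    second handle truncates and rewrites the same file. Neither open
    blocks the other, because the reader takes no exclusive lock."""
    with open(path, "rb") as scanner:        # scanner's shared handle
        first_bytes = scanner.read(5)        # read before the rewrite
        with open(path, "wb") as writer:     # e.g. a compiler's output
            writer.write(b"new build")       # succeeds despite the reader
    return first_bytes
```

The rewrite succeeds while the "scanner" handle is still open, which is why a read-only scan pass does not, by itself, make a compiler's write fail.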
 
"'...highly likely to be malicious?' So it CAN get in the way!" Yep. But solving this is properly done by either looking into the fact that it's a false "suspicious" or not coding in a way that is highly suspicious. The only times I've had Webroot block things when compiling was when I was actually trying to make something highly suspicious or potentially malicious for testing purposes.
 
The "Common" conflicts: Blocking I/O. So your development is perfectly within that common behavior. Did you know that thumbnail generation in Explorer will block more than Webroot does on email attachments?
 
Take a look with procmon sometime. While I don't rule out bugs (heck, I worked to handle them when I worked there), I think you'll find that it's not Webroot getting in the way.
 
Since we both like similes:
"Every few weeks when I walk in my back door, there's a high pitched sound and the door has a hard time closing the last few inches. So I want to take the security sensor off it. Couldn't possibly be the hinge."
Userlevel 7
Hi JeffKelly, welcome to the Community Forums. Folder exclusion is already included in the Endpoint Security (Business) version, and we believe that it is on the roadmap for the Home (Consumer) version, but we do not have formal confirmation of this or when such an inclusion may occur. Regards, Baldrick
“When we designed SecureAnywhere, we did so with common conflicts and performance problems in mind, so file/directory exclusion isn’t necessary.”
 
"When we develop bridges, we do so with common loads and problems in mind, so the beams can be half the size of the competitor's bridges"
 
Perhaps unfair to make fun of marketing fluff, but still.  I am "uncommon" in that I use Visual Studio to develop applications, which means that I am running a program that produces a program.  Sometimes Visual Studio is unable to write the new program to disk because the older version is being inspected by the anti-virus. 
 
On a more common note, I just had a pop-up from outlook about an email attachment that was "open by another process".  The attachment was a word document I had written the night before and the other process could in theory be some virus that snuck past webroot, but my suspicion is the webroot was inspecting it. 
 
I sense a narrow focus in which computer security is the theme, but a computer is a tool that still has to work.  My recommendation would be a "webroot-professional" for people who are not simply engaged in writing memos or playing games.
 
 
Userlevel 7
In any case, it still stands that if a large file being accessed is slowing down because of Webroot (Please bear in mind that Webroot "consuming CPU cycles" may not actually be slowing anything down in many cases, since in a lot of  low-threat situations it will only consume otherwise-unused CPU cycles), this should not be happening and the official support team will be very happy to help remedy that without forcing you to reduce your protection by turning it off.
Userlevel 1
You are missing the point. Your or Webroot's best practice is not necessarily the best for everyone. And at least some of your competitors recognize it.
But you are right. I was drawn here by subscription. And since it seems another of Webroot's best practices is to shoo away former clients who tell you the reason they left, I will refrain from further posting.
Userlevel 7
If you don't use Webroot for yourself or your customers,. why are you here?
 
You can unsubscribe from the thread if the notifications drew you back. Head to https://community.webroot.com/t5/user/myprofilepage/tab/user-subscriptions and check the ones that you want to remove, then use the dropdown link above the list to unsubscribe from them.
 
In any case, a good number of very common droppers will probe for ignored directories. It's not at all uncommon for me to find the malware sitting in directories of games, video files, or other things that have been set to excluded in the other AV.
 
Lack of exclusion isn't just a "we know better than you" situation. It's an enforced  "this is best practice" situation to reduce problems for Webroot customers. For every problem that could be solved by an exclusion, there are dozens caused by the same exclusions. So it's better to solve a problem properly than to disable protection, and when I worked there, we were well aware that some folks with a desperate need for control would end up making exclusions and then getting infected and costing more than their subscription in support agent salary to fix it.
 
Webroot: "It shouldn't be slowing down access to that large file. Let's fix it so things are protected and fast."
Others: "Oh, yeah, it'll always slow it down. So you have to turn off protection to fix it."
Userlevel 1
And this "we know better than you are what's best for you" approach is the reason I moved from Webroot to another solution.
These days, users who are likely to download an .exe from their email to see the kitten are protected by either Windows Defender or whatever their corporate IT installs. Neither is the target of this product. Moreover, any online instruction will tell them to disable the AV altogether (I assume Webroot still allows that) rather than excluding a folder.
So what Webroot is left with is people who know what they are doing. And Webroot chooses to keep telling those power users: "Yeah, this feature that every other AV has? Trust us. It's bad for you. We know better."
It's an insulting attitude, but ultimately it's Webroot's choice. Just as my choice is to go with a solution that gives me and my customers full control over what the AV does and does not do.
Userlevel 7
@ wrote:
During operations on large files Webroot contributes a significant portion of the CPU overhead. If the customer believes these particular files are safe they should have the option to exclude them from protection and un-needed overhead.
 
Normally it doesn't though. 
 
I can perform operations on 280GB raw video files without a peep from Webroot.
 
So if it is, then something else is wrong that should be fixed.
Otherwise it's like cutting off your nose because it itches. The side effects suck.
 
So if you're encountering that and it's not an exercise in theoretical questioning, contact support and they'll be happy to help you resolve it.
 
As for the further portion of the thread...
 
Businesses are under a "YODF" style contract when they exclude a directory. They fully understand and generally have competent legal teams that will say "Yeah, we did that, our bad." if they screw up by doing that. They have professionals who are able to calculate the risk and benefit balance.
 
Individuals don't have professionals. They'll happily follow directions from a web site to turn off their AV or "exclude this folder" so they can see this "cool video of a kitten" their friend definitely  sent them. They aren't able to understand YODF as well, and so they just don't get a stick to hurt themselves with. Remember: As long as it's a feature request that's not implemented, it's not implemented, and that's the way it should be.
Userlevel 7
I do not think that anyone would dispute this observation. ;)
 
On a personal note...it is my understanding that Webroot have a roadmap/plan for what they will introduce into the product...and we just have to assume that this has not figured highly in those plans, most likely because their top aim is to keep up the protection levels of WSA.
 
I, again personally, would prefer top-notch protection/fewer features rather than more features/potentially lesser protection...but as I said...that is just my personal preference. :D
 
 
Userlevel 1
As the OP, I would like to note that it took Webroot 5 years to reach this conclusion.
Userlevel 7
Hi boyeld
 
There are a lot of users who agree, with the result that there is a Feature Request open (see HERE) asking for this. It is currently available in the Endpoint Security/Business version, and therefore we are hopeful that it will be implemented in the Home/Consumer version some time soon.
 
Regards, Baldrick
Userlevel 7
That was worded quite nicely @ .  It's a shame to see you go.  All the best!
Userlevel 7
Badge +58
Whoa! Sounds like I wish I would...could...should have, if I could! Happy to have you with us anyway @
Userlevel 7
Moved on to Bigger and Better Things™. ^.^ Talk to Webroot if you want them to try to recruit me back. XD
Userlevel 7
Badge +58
I'm sure I'm out of my league, but I'm learning a lot here. Are you sure you want to be retired, @ ?
Userlevel 7
@Um.  Meh. :)
 
@That's not the case, thankfully. Exes, DLLs, SYS files, etc... all need to be determined. Inventory systems, especially customized ones, would be a set I'd definitely expect to do badly with monitoring. So a quick discussion with support and the threat research team is your best bet.
 
What I'm pointing out is that the problem is not what people think the problem is. I am not saying the problem is not there. I am saying there's no need to call for amputating the leg when there's a splinter in the finger, so to speak.
 
Contact with support gets threat research on the task and things are handled within a few minutes to a few hours (usually on the <30 minutes side).
 
I also specifically refrained from mentioning it before due to the length of the prior post, but threat research has the ability to help with applications that update themselves daily or even hourly. (Bear in mind, application data updates don't cause a problem. Only when applications have programming updates. But that is common enough too.) At least they did when I was working there. They can make special rules specifically for a company that affect that company only. So effectively, they create the effect of "ignoring" the executable folder for the target application. Yes, this technically opens up the potential for a dropper to target that folder, but two things factor in: The first being that hard positives are still detected, so known malware is still caught even in the "excluded" folder. The second being that it simply stops monitoring them as Unknowns within the scope of the company in question, so if something or somebody tried to use that to get a false negative (get a virus marked as good), it wouldn't work.
 
As for "That Topic" that you are fond of pointing out...
The very first requestor is talking about DB files (not affected, and Webroot doesn't lock files while scanning them anyway), server shares (not affected, as Webroot doesn't scan them unless explicitly instructed to anyway), and files executed from server shares (yes, they are scanned when they are executed, but not locked, so there is no performance issue). So far every single post on that thread is from people who do not understand how Webroot operates and are calling for solutions based on the operation methods of other antivirus software. It's like people who have never seen a motorcycle before complaining that their motorcycle needs two more wheels on it and a steering wheel and gas pedal and brake pedal because "cars need to have those things!"
 
The proper solution is not "Spend hours deploying something else".  The proper solution is "Spend seven minutes contacting Webroot Enterprise Support, then have it fixed within half an hour usually, and definitely faster than it would take to deploy something else."  As a support person yourself, you should know that you should describe symptoms and ask for the appropriate solution, not prescribe your own solution. You are an expert at the software you work with and the things you do. Others are experts at Webroot software.
 
Remember: Those seven minutes you didn't want to spend to get a proper solution in this case are leaving you running something that won't detect a new virus for several days or more (and statistically won't detect 55% of all the malware out there).
Kit, thank you very much for the time you've given on this topic...especially given the detail and depth provided.  The absolute first thing that I always do is exclude all of my applications from the realtime monitors, then those paths/folders suggested by Microsoft and my software vendors.  I have several large companies who only have one thing in common: they all have a particular inventory system on their terminal servers. For example, this month we simply attempted to move from Bitdefender to Webroot...that's all...and the software vendors are claiming that Webroot is causing various problems.  The executables are excluded already (from day 1).  So, as per your write-up, we can safely assume there is nothing more we can do since it's all pointless anyway, and Webroot knows everything already, and therefore explicitly excluding our dataset would be a waste of time, etc.
 
As such, I don't think it's necessary to waste anyone's time on this any further.  I think that all of the enterprise/corporate users in the biz forum should read your posts, Kit, as it was very enlightening for me. It lets us know that the only thing we can do is exclude the apps...the data is already ignored by default, as WR knows all.
 
There are other problems I've been experiencing that simply won't work well with this sort of approach.  There are commercial applications that update themselves very regularly (at least once a day) in much the same way that an AV product does.  Often the application itself is updated as well (sometimes to the tune of many binaries).  In the end, it is a solution that is needed, not a dissertation; therefore, we'll see what support can offer, and for now we've already moved our servers over to ESET and the problems we were having have subsided.  Problem solved.  Thanks again to all, and I'd suggest directing the security experts and software developers that can't get it to work (see THIS TOPIC in the biz forum) to the wealth of information disclosed here on the topic...
Userlevel 4
That should be the detailed description of the official product! Great and informative!

Thank you.
