Solved

Exclude folders from scanning?


Userlevel 1
Is it possible to exclude certain folders from scanning?
 
Thanks!

Best answer by Kit 7 May 2012, 16:16


176 replies

Userlevel 7
@ wrote:
During operations on large files Webroot contributes a significant portion of the CPU overhead. If the customer believes these particular files are safe they should have the option to exclude them from protection and un-needed overhead.
 
Normally it doesn't though. 
 
I can perform operations on 280GB raw video files without a peep from Webroot.
 
So if it is, then something else is wrong that should be fixed.
Otherwise it's like cutting off your nose because it itches. The side effects suck.
 
So if you're encountering that and it's not an exercise in theoretical questioning, contact support and they'll be happy to help you resolve it.
 
As for the further portion of the thread...
 
Businesses are under a "YODF" style contract when they exclude a directory. They fully understand and generally have competent legal teams that will say "Yeah, we did that, our bad." if they screw up by doing that. They have professionals who are able to calculate the risk and benefit balance.
 
Individuals don't have professionals. They'll happily follow directions from a web site to turn off their AV or "exclude this folder" so they can see this "cool video of a kitten" their friend definitely sent them. They aren't able to understand YODF as well, and so they just don't get a stick to hurt themselves with. Remember: as long as this stays an unimplemented feature request, it stays unimplemented, and that's the way it should be.
Userlevel 7
@Um.  Meh. :)
 
@That's not the case, thankfully. Exes, DLLs, SYS files, etc... all need to be determined. Inventory systems, especially customized ones, would be a set I'd definitely expect to do badly with monitoring. So a quick discussion with support and the threat research team is your best bet.
 
What I'm pointing out is that the problem is not what people think the problem is. I am not saying the problem is not there. I am saying there's no need to call for amputating the leg when there's a splinter in the finger, so to speak.
 
Contact with support gets threat research on the task and things are handled within a few minutes to a few hours (usually on the <30 minutes side).
 
I also specifically refrained from mentioning it before due to the length of the prior post, but threat research has the ability to help with applications that update themselves daily or even hourly. (Bear in mind, application data updates don't cause a problem. Only when applications have programming updates. But that is common enough too.) At least they did when I was working there. They can make special rules specifically for a company that affect that company only. So effectively, they create the effect of "ignoring" the executable folder for the target application. Yes, this technically opens up the potential for a dropper to target that folder, but two things factor in: The first being that hard positives are still detected, so known malware is still caught even in the "excluded" folder. The second being that it simply stops monitoring them as Unknowns within the scope of the company in question, so if something or somebody tried to use that to get a false negative (get a virus marked as good), it wouldn't work.
 
As for "That Topic" that you are fond of pointing out...
The very first requestor is talking about DB files (not affected, and Webroot doesn't lock files while scanning them anyway), server shares (not affected, as Webroot doesn't scan them unless explicitly instructed to anyway), and files executed from server shares (yes, they are scanned when they are executed, but not locked, so there is no performance issue). So far every single post on that thread is from people who do not understand how Webroot operates and are calling for solutions based on the operation methods of other antivirus software. It's like people who have never seen a motorcycle before complaining that their motorcycle needs two more wheels on it and a steering wheel and gas pedal and brake pedal because "cars need to have those things!"
 
The proper solution is not "Spend hours deploying something else".  The proper solution is "Spend seven minutes contacting Webroot Enterprise Support, then have it fixed within half an hour usually, and definitely faster than it would take to deploy something else."  As a support person yourself, you should know that you should describe symptoms and ask for the appropriate solution, not prescribe your own solution. You are an expert at the software you work with and the things you do. Others are experts at Webroot software.
 
Remember: Those seven minutes you didn't want to spend to get a proper solution in this case are leaving you running something that won't detect a new virus for several days or more (and statistically won't detect 55% of all the malware out there).
Userlevel 7
If you don't use Webroot for yourself or your customers, why are you here?
 
You can unsubscribe from the thread if the notifications drew you back. Head to https://community.webroot.com/t5/user/myprofilepage/tab/user-subscriptions and check the ones that you want to remove, then use the dropdown link above the list to unsubscribe from them.
 
In any case, a good number of very common droppers will probe for ignored directories. It's not at all uncommon for me to find the malware sitting in directories of games, video files, or other things that have been set to excluded in the other AV.
 
Lack of exclusion isn't just a "we know better than you" situation. It's an enforced  "this is best practice" situation to reduce problems for Webroot customers. For every problem that could be solved by an exclusion, there are dozens caused by the same exclusions. So it's better to solve a problem properly than to disable protection, and when I worked there, we were well aware that some folks with a desperate need for control would end up making exclusions and then getting infected and costing more than their subscription in support agent salary to fix it.
 
Webroot: "It shouldn't be slowing down access to that large file. Let's fix it so things are protected and fast."
Others: "Oh, yeah, it'll always slow it down. So you have to turn off protection to fix it."
Userlevel 7
Be careful about thinking it's "Just Marketing" when there's a tech person involved who understands kernel drivers.
 
You are anything but "uncommon" in the sense of that. What is "Common" in that wording is much more meta than simple coding. When I worked at Webroot, coders made up a surprisingly large amount of our user base based on the presence count of compilers on Webroot protected systems. (It's literally a number, mind you, so don't go shouting privacy foul. We could say we want to know how many systems a specific compiler is on, and it would give us nothing but  a count. We -must- know the hash of the compiler executable in order to look it up as well, so random files cannot be found.)
 
"Sometimes visual studio is unable to write the new program to disk because the older version is being inspected by the anti-virus."
" I just had a pop-up from outlook about an email attachment that was "open by another process".  The attachment was a word document I had written the night before and the other process could in theory be some virus that snuck past webroot, but my suspicion is the webroot was inspecting it. "
 
Great news! It's not Webroot inspecting. Webroot doesn't do blocking I/O. It does not open files for exclusive access when just scanning. Have you checked whether you have Windows Defender running? It does.
 
As a coder, you should be familiar with stream (including file) open commands. You pick the file, you pick the mode, you get a handle.
 
In Visual Studio, unless you're writing multi-hundred-megabyte files, Webroot generally finishes scanning them in a fraction of a second. You can easily see its actions in Procmon. That being said, "generally" is an operative term; some people have extra factors that impact I/O performance, for example. In any case, Webroot gets a file handle, yes, but it's a non-exclusive, non-blocking file handle.
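 
For the curious, here's roughly what that difference looks like at the API level. This is a minimal sketch of my own (the path is made up, and this is not Webroot's actual code): a read that shares read, write, and delete access, so other processes can keep working on the file while it's being looked at. A share mode of 0 would be the exclusive open that does block writers.
 
```cpp
// Minimal Win32 sketch (illustration only, not Webroot's code): open a file
// for reading without locking anyone else out of it.
#include <windows.h>
#include <cstdio>

int main() {
    HANDLE h = CreateFileW(L"C:\\temp\\bigfile.bin",      // hypothetical path
                           GENERIC_READ,
                           // Share flags: readers, writers, and deleters are not blocked.
                           FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                           nullptr, OPEN_EXISTING,
                           FILE_FLAG_SEQUENTIAL_SCAN, nullptr);
    if (h == INVALID_HANDLE_VALUE) {
        std::printf("open failed: %lu\n", GetLastError());
        return 1;
    }
    // ... read and hash the contents here; other processes can still modify
    // or delete the file in the meantime ...
    CloseHandle(h);
    return 0;
}
```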
 
So even if something is driving your IOPS down the drain and Webroot is going to take, say, twenty seconds to scan, you can still modify the file, overwrite the file, or delete the file and Webroot will not get in the way -unless- Webroot has decided that the file matches the specifications of something highly likely to be malicious. Mind you, if it's taking Webroot twenty seconds to scan it, it would also take you that or more to write it just in I/O lag, but that's beside the point.
 
"'...highly likely to be malicious?' So it CAN get in the way!" Yep. But solving this is properly done by either looking into the fact that it's a false "suspicious" or not coding in a way that is highly suspicious. The only times I've had Webroot block things when compiling was when I was actually trying to make something highly suspicious or potentially malicious for testing purposes.
 
The "Common" conflicts: Blocking I/O. So your development is perfectly within that common behavior. Did you know that thumbnail generation in Explorer will block more than Webroot does on email attachments?
 
Take a look with procmon sometime. While I don't rule out bugs (heck, I worked to handle them when I worked there), I think you'll find that it's not Webroot getting in the way.
 
Since we both like similes:
"Every few weeks when I walk in my back door, there's a high pitched sound and the door has a hard time closing the last few inches. So I want to take the security sensor off it. Couldn't possibly be the hinge."
Userlevel 7
Hi boyeld
 
There are a lot of users who agree, with the result that there is a Feature Request open (see HERE) asking for this. It is currently available in the Endpoint Security/Business version, and therefore we are hopeful that it will be implemented in the Home/Consumer version some time soon.
 
Regards, Baldrick
Userlevel 7
I do not think that anyone would dispute this observation. ;)
 
On a personal note...it is my understanding that Webroot have a roadmap/plan for what they will introduce into the product...and we just have to assume that this has not figured highly in those plans, most likely because their top aim is to keep up the protection levels of WSA.
 
I, again personally, would prefer top-notch protection/fewer features rather than more features/potentially lesser protection...but as I said...that is just my personal preference. :D
 
 
Userlevel 7
In any case, it still stands that if a large file being accessed is slowing down because of Webroot (Please bear in mind that Webroot "consuming CPU cycles" may not actually be slowing anything down in many cases, since in a lot of  low-threat situations it will only consume otherwise-unused CPU cycles), this should not be happening and the official support team will be very happy to help remedy that without forcing you to reduce your protection by turning it off.
Userlevel 7
Better, more technical explanations:

Other AV works by locking out access to the resource while it checks it.  If the game has a huge, hundred meg or gig-plus resource, it takes the AV a long time to check it, during which time the game cannot access it. That causes the game to behave poorly or crash.
 
Games may not start with other AVs because they do binary difference patches. That looks a lot like some stuff that viruses do. So other AVs lock them down or inject code into them to see better what's going on. Bad code injections (as opposed to unobtrusive ones) can cause multi-threading contention issues and Poof! Or the patching ability is blocked, so doom occurs.
 
Webroot works by monitoring unknown things and not locking things down while it scans. It doesn't care about binary patching because it can record the patch effects and undo it if the patch turns out to be bad, so it is able to allow patching and game resource reads to occur cleanly.
 
As for the System Events ticker, it is an accurate count.  The tricky thing is that Webroot itself is a manually-drawn window, so the Window/GDI (Graphics Device Interface) events increase MUCH more swiftly when Webroot is open, because it monitors all events, even its own. Other windows don't get the kind of constant stream of data from the OS that Webroot has to handle to keep control for you, so their Window/GDI events are slower by comparison. You can see a jump in Window/GDI event speed when you do certain things, like running games that require thousands of updates per second, and you can see a difference based on how many windows are open.
 
Keep in mind that the code functions to monitor and decide on the legitimacy of each event are in the trillionths of a second, so it doesn't hurt your performance, consume CPU, or slow things down.
 
The system events ticker itself was included because originally a lot of people thought "It's not actually doing anything". So it was put there to say "Yes, I actually am doing something, KThx."
Userlevel 7
Please note I didn't say "There's definitely not a problem", I said "chances are there's not a problem". I don't expect in any way that Webroot is perfect, but I was not lying when I said that I've never seen a given case. I have seen things that Webroot misses, but also everything else missed it as well, and Webroot only missed it for about two hours, while everything else started picking it up after a week. So no, nothing's perfect. Everything sucks. Webroot just sucks less. ;)
 
It's common to throw blame around. People want to find something to blame for everything they dislike. As a good example, a commonly-used firewall had problems when it was installed alongside Webroot.  Webroot alone? No problem. Firewall alone? No problem. Firewall and Webroot? The firewall had problems. Must be Webroot's fault, right?
 
Turned out that the firewall itself was the problem. When attaching to a driver, there are two ways that the driver can require an attachment, effectively "full capability" and "limited". Network drivers generally only use Limited, but they can go into full mode, they just rarely do and when they do, they don't make use of anything other than limited commands. When Webroot is installed, Webroot's network driver stuff goes into Full mode and actually USES full mode, which is a good thing. Unfortunately the people who wrote the firewall thought "Nobody ever uses Full mode on network drivers, so why should we support it?" So their stuff got confused when they got a full mode message.
 
So could one say "Webroot was the problem"? Technically, yes. Webroot used something better that nobody else used and the badly-written firewall couldn't deal with it. But if the firewall had been written properly to begin with, there would have been no problem, whereas Webroot would have to be written badly instead of written properly to remove the problem.
 
So, the viruses...
AutoCAD viruses are malicious interpreted code in AutoLISP or VBA that take advantage of an older  version of AutoCAD (annoying that "older" means anything prior to 2013 SP1) to operate. Newer AutoCAD is not susceptible to this issue, and user attention also reduces the likelihood of being affected by this kind of infection, since it cannot do anything unless loaded into older AutoCAD. It is sadly also exceptionally trivial to hide these infections from other AVs and also to have false positives on them, so trusting any AV to detect these is not good.
 
"Baidu Malicious Code" is the Baidu Toolbar.  It's not a virus, it's just an annoying toolbar, like the Ask Toolbar and such. In many cases, AV programs will trigger on installers that are able to install the toolbar even if they default to not doing so. AV programs will also trigger on settings from the toolbar in the registry even if the toolbar itself is not there.
 
QQ malicious code - The only thing I could find about that is a False Positive from Avira on QQ.exe, QQBaseClassInDll.dll, and QQHelperDll.dll.
 
As was indicated by the Webroot Threat Researcher, without seeing the full logs of the detection from the other program, nobody can say for certain what happened; we can only guess. I still stand by my statement that I have never seen anything detected by other AV that is actually a legitimate threat to the computer Webroot is installed on.
 
So no, no shares.  Not even a publicly-traded company anyway. But I can still say "Chances are it's not an issue" and be accurate, and I can also say "Without seeing logs, we have no way to know for sure, but here are examples of why this can happen." 😛 :)
 
Userlevel 7
I suppose then that a more detailed explanation is in order.
 
Webroot doesn't care about it unless it is code. So the data files are ignored. Yes, I understand this is speaking of things such as multi-terabyte non-relational data sets and other such wonderful things.
 
All of those data files are looked at for slightly less than a very small amount of time.  Their names are read and they are determined to be "Probably not code". The first few bytes are read from the kernel level, non-blocking, and determined to not be PE, so after that they are ignored. Even if they change, unless the first sector is modified, they are ignored. The only thing that would cause them to stop being ignored otherwise is if they are read into memory and an execution pointer is set to them.
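 
To make that "is this even code?" check concrete, here is a rough sketch of the general idea (my own illustration, not Webroot's implementation): a PE executable starts with the two-byte DOS magic "MZ", so most data files can be ruled out after reading only the first bytes.
 
```cpp
// Sketch: cheap "could this be PE code?" test by peeking at the first two bytes.
// Illustrative only; a real scanner does far more than this.
#include <cstdio>
#include <fstream>

bool looks_like_pe(const char* path) {
    std::ifstream f(path, std::ios::binary);
    char magic[2] = {};
    f.read(magic, 2);
    // PE files begin with the DOS header magic "MZ" (0x4D 0x5A).
    return f.gcount() == 2 && magic[0] == 'M' && magic[1] == 'Z';
}

int main(int argc, char** argv) {
    if (argc > 1)
        std::printf("%s: %s\n", argv[1],
                    looks_like_pe(argv[1]) ? "PE, keep watching" : "not PE, ignore");
    return 0;
}
```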
 
The Applications, on the other hand, are code. They are examined in brief and determined to be code. Then they are hashed and reported to the cloud. If the cloud has them defined as [g]ood, they are watched for changes and rarely hashed non-blockingly, but otherwise left alone. Obviously [b]ad files are blocked from executing and removed. [u]nknown files are where the complexity starts.
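 
As a toy sketch of that dispatch (the hash_file and cloud_lookup names are placeholders I made up for the example, not real Webroot interfaces):
 
```cpp
// Toy sketch of the [g]ood / [b]ad / [u]nknown flow described above.
// hash_file() and cloud_lookup() are hypothetical stubs, not real APIs.
#include <string>

enum class Verdict { Good, Bad, Unknown };

std::string hash_file(const std::string& path)  { return "stub-hash"; }        // placeholder
Verdict     cloud_lookup(const std::string& h)  { return Verdict::Unknown; }   // placeholder

void handle_file(const std::string& path) {
    switch (cloud_lookup(hash_file(path))) {
        case Verdict::Good:
            // Known good: watch for changes, rarely re-hash, otherwise leave alone.
            break;
        case Verdict::Bad:
            // Known bad: block execution and remove.
            break;
        case Verdict::Unknown:
            // Unknown: monitor, journal, pseudo-sandbox (see the list below).
            break;
    }
}
```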
 
Unknown files are subject to:
- Monitoring
- Journalling
- Pseudo-sandboxing
 
Just because something is expensive doesn't mean it was written well and is perfect. In fact, I've often found the opposite to be true. The more expensive, the less adaptive it often is, and definitely the less common it is, so the less likely it is to be known. Single-company custom apps obviously can't be known at all until they are scanned for the first time. This is where overriding to [g]ood status on the Webroot console is helpful, and the folks at Threat Research can set the determination to [g]ood on the Webroot side if needed.
 
Anyway, monitoring can be intrusive. The process may have a DLL injected for deeper inspection, which definitely has a potential to cause problems if the process's code is not flexible. For example, if the process forks a subroutine while sleeping another line for enough time for that subroutine to finish, then waking up and expecting the result to be present, the monitoring could have delayed the forked routine just long enough to cause the result to NOT be present. Read the start of the result for its length and get unprepped memory, then start reading far too much and fault and crash.
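 
That fragile pattern looks roughly like this. This is a deliberately buggy sketch of my own, not any particular vendor's code: the main thread sleeps a fixed time instead of joining, so any extra delay from monitoring means it reads a result that isn't there yet.
 
```cpp
// Deliberately fragile sketch: relying on a fixed sleep instead of proper
// synchronization. Any added delay (e.g. from monitoring) breaks it.
#include <chrono>
#include <cstdio>
#include <thread>

int* g_result = nullptr;   // the worker is "expected" to have set this in time

void worker() {
    std::this_thread::sleep_for(std::chrono::milliseconds(90));  // pretend work
    static int value = 42;
    g_result = &value;
}

int main() {
    std::thread t(worker);
    // BUG: assumes 100 ms is always enough; should join or use a condition variable.
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    std::printf("%d\n", *g_result);   // crashes if the worker was delayed past 100 ms
    t.join();
    return 0;
}
```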
 
Journalling normally won't cause an issue like crashing, but with I/O intensive applications it can cause a slowdown in performance. Also, some applications are very timing-sensitive and give up on I/O and fault if they can't complete it in time. They may work at a kernel level to prioritize their operations, but Webroot works at a kernel level too and sits deeper, so it forces its way in anyway.
 
And of course sandboxing just throws all sanity out the window for the application. Access attempts to sensitive system functions are silently dropped and just never return in many cases, or return "generic" results. Also consider such things as the identity shield and its impact. So an application may want to use ieframe.dll to render rich content and because it's not trusted yet, it is blocked from interacting with ieframe.dll successfully.  It sends "Show XYZ", then sends "Tell me what you are showing" as a very common programming step. It expects to receive "OK" effectively from the first command, but gets nothing. Perhaps then it stays in an endless loop waiting for an OK or Error! signal. Or perhaps it continues on. Then gets NOTHING...not even null...from the second command... and tries to read the result pointer... Then "Process tried to access memory at address - memory could not be 'read'." Yup.  Dead application again.
 
But once it's overridden and set to [g]ood, or determined by TR as [g]ood, all those problems go away.
 
So ignoring the data folder would do nothing, since it's actually intelligently ignored already since it's not PE. But protection is not reduced since starting execution from code hidden in data would trigger a scan of what was loaded.
 
And no, not even Webroot will detect everything on day one. BUT... the aforementioned journalling and pseudo-sandboxing and monitoring means that it has a much better chance of being detected before or as it starts to try to do something malicious. Plus the sandboxing makes data release not happen and the journalling means that once it is detected, it is precisely cleaned for that exact instance of the malware and every action it took is reversed in order.
 
The fact that Webroot knows GOOD (whitelisted) programs, and the fact that it only cares about executable code means that it can be exceptionally intelligent about what it watches.  So it's just a matter of getting the executable code that interacts with the data in question to be marked as Good.
Userlevel 7
Hi JeffKelly
 
Welcome to the Community Forums. Folder exclusion is already included in the Endpoint Security (Business) version, and we believe that it is on the roadmap for the Home (Consumer) version, but we do not have formal confirmation of this or when such an inclusion may occur.
 
Regards, Baldrick
Userlevel 7
Badge +35
@ wrote:
Here is Google's "strong" advice re. installation of its G Suite syncing software, to permit MS Outlook users to sync their pst files with G Suite email, etc.:
"Dear G Suite Sync user,

Thanks for installing G Suite Sync for Microsoft Outlook®. This software will synchronize your calendar, contacts, email, notes, tasks and domain's global address list with G Suite. Before you get started, there are a few things you should know about the current version of G Suite:
• Your journal entries will not synchronize with G Suite.
• G Suite Sync will initially download up to 1GB of email from the G Suite Server to your desktop. You can change this setting from the system tray menu. (learn more).
• Your initial sync can take a long time, because there's a lot of email to download. To see your synchronization status, look at synchronization status in your system tray.
• We strongly recommend that you create an exclude rule in your antivirus software so that it does not scan any files located under %LOCALAPPDATA%\Google\Google Apps Sync.
For more information, go here:
• G Suite Sync User Manual
• What's different and won't sync between Microsoft Outlook® and G Suite
• Read the Frequently Asked Questions
• Tools for administrators deploying G Suite Sync
• How to get help
Thanks for using G Suite Sync,
The G Suite Sync Team"
These suggestions are for "traditional" AV software. Since Webroot SecureAnywhere functions differently, exclusions like these are only needed in rare cases. 
 
-Dan
Userlevel 7
Badge +35
@ wrote:
As a coder, you should be familiar with stream (including file) open commands. You pick the file, you pick the mode, you get a handle.
 
So why am I getting locked files that are causing compilation to fail? Lockhunter reports WRSA holding the locks.
 


Please Submit a Support Ticket for this case. Shutting down WSA when compiling may be a temporary solution, though obviously not ideal. 
 
-Dan
Userlevel 7
There we go. Having the reasoning makes it easier to address your concerns. Keep in mind, you're not talking to a Corporate Stooge/Tier 1 tech here.  You're talking to a real person with 17 years in the security industry who works here because it's the best solution out there.   Yes, I frequent /. and Wilders and various others too.  You'll also find a lot of Wilders users on here.
 
For one, I have huge archives, ebook collections and other stuff that doesn't need to be scanned over and over again.
 
Not a concern.  We don't operate that way anyway, unless you're doing something silly like Full Scans regularly as opposed to the default deep scans.  Remember, we're not the Other Security Software.  We already know the eBooks can't contain threats, and we don't care about the archives.  There's a reason we've been consistently the lightest and fastest since we came out.  Simply put, the agent knows what can run, what is likely to run, and what will run.  It doesn't worry about data files that cannot contain machine code, and in the event of things such as buffer overflow remote code execution, it uses a realtime system at that time rather than trying to determine which files could possibly cause an issue.
 
Trust me, my wife used to have to set to exclude her art directories (yay for monolithic high-resolution PSDs), and she doesn't anymore because even she recognizes that the agent doesn't even bother to glance at them.
 
I also have some password recovery programs that are always mistaken for a virus and then sometimes simply deleted (especially by Bitdefender).
 
This is a trickier one.  No matter what else you use, you're stuck in a situation where "Ignore this file proactively" is a need in those cases.  Norton requires you to ignore the file too.  That being said, we -never- "simply delete".  Will we detect things as "Hacking Tools" if they can be used for nefarious purposes?  Yes.  However unless you're uninstalling and reinstalling the agent frequently, this is a non-issue.  A one-time "Put that back and ignore it" solves that issue without leaving the figurative key beneath your doormat. 
 
Unfortunately, despite my own opinions and everybody else's, we really do have to do what's best for "The majority", so your much-derided "Because it's what's good for you" actions are necessary.  While you may be able to recover from a severe infection in minutes, the majority of people can't and it's a catastrophic event.
 
I know how it works and I like to be in charge and take decisions. I don't like software telling me what I can and what I can't do.
 
Been there.  Still there.  Honestly, I get much greater granularity of control from WSA.  Even Norton has limitations.  Try telling it to use under 6MB of RAM while idle. Try telling it to "ignore the low level utilities in this directory, but please do catch any actual threat that goes in there"...  The former is done by an exclude, but the second half of the latter is not-done due to the exclude.
 
It kind of sounds like you've been doing full scans.  Don't.  That's not an order, mind you.  That's a recommendation.  Don't waste your time making the WSA agent pretend to be a slow, old, bloated security client.  It's just not necessary, nor does it really help anything.  Plus it'll cause you to have more desire for exclusions. Under normal operation with deep scans, it works faster, more efficiently, and there is no need for exclusions.
 
Anyway, it's all up to you.  Every software package will have its own upsides and downsides.  On the firewall side, I use the Windows firewall to pre-emptively block software from accessing the internet. Others use the firewall of their choice. The SecureAnywhere firewall is specifically intended to be an anti-threat firewall, not a firewall made for user control decisions.  That's why it can run along with any other firewall. 
 
I answered this because even if you went back to Norton, you probably wouldn't have responded if you weren't looking for an answer.  Oh, and if you truly decide that ditching the software is the way to go, we give a 70-day Money Back guarantee if you got it through an authorized source.
 
That being said, if you have low-level questions, please do feel free to continue to ask.  I do have answers.
 
Edit: Spell-Checking again (The original was posted from home with no time to finish completely.  *Oops*).  Anyway, one of the things I would ask you to keep in mind is that while you can see your desires and actions, we have to take into account the needs and actions of millions of users.  I think that when you said "semi-intelligent adult", you grossly underestimated your skill level.  We used to see all the people who were infected because they followed instructions on web pages that explained precisely how to exclude a directory from scanning, followed by the saving "this program you want" to that directory and running it.  If ever "This is why we can't have nice things" applies, this is an example.  And that's just one of the ways exclusions caused problems.  So we really are working hard to make all of the classic and other possible reasons for wanting an exclusion function obsolete so that a blanket exclusion is not necessary.
Userlevel 7
Hi Nathan, and welcome to the community!
 
Your question actually delves into some more technical stuff, so you might end up learning more than you ever wanted to know about file systems. ;)  However I will give the short part of the answer first.
 
Short answer:
SecureAnywhere will not scan the FTP mounted drives unless you explicitly start a full scan or a custom scan to do so, which does warn you before allowing you to start it.  Normal scans will not look at the mounted FTP drives unless something on the FTP drive is set to auto-run or is about to execute.
SecureAnywhere also will (technically) never scan anything across the FTP system ever. If you run something from the mounted FTP, it will be scanned prior to execution.  However it still generally will not be scanned from the server.
 
The first one likely covers your needs. That second one is what requires more explanation.
 
A mounted drive is generally treated as a block device. However FTP is not a block-supporting protocol.* When a file is read off the FTP "drive", it's really moved to a staging area on the local system drive before it can really be read. Generally the mount handler will do this transparently to the view of the program requesting it, so if F:\some.txt is on the FTP, the program will see that it is working with F:\some.txt. In reality, when the program requests some.txt, the mount handler will download some.txt to a temporary location, read it from the local disk during block handling, and upload it back to the FTP server when it is closed if the contents have been changed.
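 
A rough sketch of that staging behaviour, with ftp_download/ftp_upload as hypothetical placeholders for whatever the mount handler really uses:
 
```cpp
// Sketch of how an FTP "drive" mount handler might stage files locally.
// ftp_download()/ftp_upload() are hypothetical placeholders, not a real API.
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

void ftp_download(const std::string& remote, const fs::path& local) { /* placeholder */ }
void ftp_upload(const fs::path& local, const std::string& remote)   { /* placeholder */ }

fs::path open_remote_file(const std::string& remote_path) {
    // 1. Copy the remote file to a local staging area.
    fs::path staged = fs::temp_directory_path() / "ftp_stage" / fs::path(remote_path).filename();
    fs::create_directories(staged.parent_path());
    ftp_download(remote_path, staged);
    // 2. The calling program reads/writes this local copy as if it were F:\...
    return staged;
}

void close_remote_file(const std::string& remote_path, const fs::path& staged, bool modified) {
    // 3. On close, push changes back to the server only if the contents changed.
    if (modified) ftp_upload(staged, remote_path);
}
```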
 
So the good news is that FTP-mounted drives and indeed, any network drives at all are still not good reasons for us to make it possible to exclude anything.
 
After all, it doesn't matter how "hidden" the exclusion is. A threat doesn't need to see or know the exclusion or how to add one in order to take advantage of exclusions. Make a custom, inert downloader (eg, something that is NOT actually a threat and won't be caught as one.  I could do it in five minutes and I guarantee that no AV will catch it as a threat.) Then just write something that it knows should be caught by the AV to every folder out there (or make educated guesses about what may or likely will be excluded, like game folders and such) and see what copies survive because they are in excluded folders.  When it sees what survives, snag the real threat and drop it there.  Easy as pie and already done by numerous droppers, which is especially fun since no AV detects the droppers**, just the stuff they grab and drop.
 
(* FTP does have resume capability, which allows it to start a transfer from a specific byte offset, but that's about the closest to block stuff it has.)
(** Some AVs recognize the droppers heuristically based on what they drop being detected, not the dropper code itself.)
Userlevel 7
iMonitor (and others) specifically want to be excluded from scans because they expect that AV scans will detect things they do as bad and/or impact their necessary performance.
 
With any other AV, this is a completely valid assumption. Other AVs intercept disk I/O and lock it down until they are finished scanning. This causes the program that needs to have unfettered I/O to be blocked for that time, so it has issues.
 
Webroot does -NOT- block I/O while scanning and does not redirect disk activity to slow it down.  Thus, the need for exclusions that exists for all other AV software does not exist for Webroot.
 
There is one basic rule:
If something says "You -must- do XYZ with your AV to use us" (exclude us, turn it off to install us, etc.), chances are nearly perfect that it is not the case with Webroot SecureAnywhere. Just try it while ignoring that bad advice from the other program and chances are pretty strong it will work just fine anyway. If it doesn't, Webroot is absolutely thrilled to look at both programs, figure out where the other one failed, and make workarounds for it explicitly that do not reduce your protection.
 
After all, that is a HUGE security hole, and I would strongly expect there are threats that explicitly look for iMonitor to be installed because they know they can hang out in its folder and never be caught.
 
So best to just use it and see if it actually causes a problem.  Chances are that it won't, and on the slim possibility that it does, Webroot will just fix it anyway.
Userlevel 7
@ wrote:
Kit, thanks for the great explanation. I wonder why so skilled specialist is already retired Webrooter ;)
There comes a time for all people to move on to New Things™. Trust me, it shocked a lot of folks when I departed Webroot.  Now I'm making more money as the Technology Manager at a school and I get to put my "OMG HALP!  MAKE IT WORK!!" skills to excellent use. From security to WiFi to hardware to aquariums...  Sheesh. XD
Userlevel 7
Some good discussions in this thread! But I'll post my thoughts. The first is that adware isn't always a bad thing, and that's an important thing to remember. Lots of "free" software is ad-supported; this is very common in mobile applications. The people that write this software need to earn a living somehow!
 
Toolbars are a sticky point. Personally, I hate them all; if I had my way I would blacklist every single one of them. However, lots of people use them and they do have certain uses. I deal with a large number of tickets with people complaining that Webroot isn't removing a certain toolbar. I won't start naming the common ones, but they nearly all require you to click OK a number of times before they install, so you know what you're getting. Now, if a toolbar is deemed malicious, or it installs in a sneaky way, or won't uninstall, we will mark it bad. However, Webroot isn't the software police; just because a piece of software is poorly written or you don't like it doesn't mean we should be blocking it :)
 
As for VT, again you have to be very careful with its results. If I had a Euro for every time I saw an infection that wasn't in VT, I would be a rich man. Also, it's common for a big vendor to mark a file as bad and then people start jumping on that determination and copying it. VT is a useful tool, but don't rely on it as the sole source of information on a file; use it along with other information!
 
Other AV programs' detections: this is a tricky one, really. Certain AV programs I will trust more than others, and some are just pure junk. A program that most of us are familiar with is Malwarebytes. It will pick up registry issues and general Windows issues as well as infections, and this can cause confusion. So a customer will say they have 40 infections, but it's actually a toolbar with the 39 associated registry entries for the program! I actually like Malwarebytes, but you have to know what it's doing/detecting.
 
Webroot's way of protecting your PC is different in that we don't scan every file on your PC like other AVs do. This is a point that many people don't understand. Why bother scanning hundreds of GBs of data that will never be an infection? Why waste system resources and, more importantly, your time doing so? In order for an infection to do anything it has to run in active memory. Once it does so, then we pounce and kill it. All of our data is in the cloud, meaning that you don't have to download a large number of definitions every hour.
 
This is awesome, but we run into an issue when somebody runs a full system scan with MSE or Norton and they find some dead infection that's been sitting in a random path for two years. Then they think WSA isn't doing its job, but we are working as designed. I have about 1GB of infections on a test PC, sitting in a folder just off the root of C:. They are inactive and will never do anything unless I actually run them. If I scan the folder with WSA it will remove them all. Or if I run one of the files, WSA will pop up and remove the file (and it will probably scan that folder and remove the rest). WSA can scan your full PC if you want, like a traditional AV, but I never do it myself.
 
Sorry for the long post!
 
As for the samples that were posted.
 
c:\users\daniel\downloads\fe4e1b5427cd214a9171ff11921e488f3fa2c0c7.exe
 
This file has been seen on 2 PCs and nowhere in the wild from what I can see.
 
c:\users\daniel\downloads\????2006 ???cibaf.exe
 
Is a good file that has been seen on 6 PCs.

The Baidu toolbar files are bad in our database and WSA will remove them.
Userlevel 7
In cases of VMs, technically Webroot should be run on the VM.  Obviously in the case of a Mac VM, this ends up being a different process. No AV is perfect. Even the user will create security holes. No AV can stop somebody from doing something unexpected or dumb. ;)
 
So, Webroot is Extra-Light partially because it is smart enough to only scan what can actually be bad, yes. There are other things that contribute to being light though. 
 
Certain generic data is collected regarding every file that is scanned by every AV. On Other AV, this data is then compared against the local database of "Bad Stuff".  That database can get pretty big (hundreds of megabytes). It has to remain in memory or it will chew up the disk all the time instead. And each record it goes through is multiplied by each extra file it scans. 5,000,000 records times 300,000 files means a heck of a lot of comparisons. Even one file against 5,000,000 records is a mess of work comparatively. And while 5,000,000 may seem like a lot, it's a fraction of the bad stuff out there.
 
Webroot gets the data, then sends it to the servers on the internet. They have a heck of a lot more power than any one computer and they have access to somewhere like 500,000,000,000 pieces of information instead.  (Number pulled from under a carpet.  Maybe somebody with current data can chime in with how many rows are in the Enzo DB now).  So instead of your computer getting 30 pieces of generic data about a file and comparing it to 5 million things every time you access a file, it gets those 30 pieces of generic data and lets the server compare them to 500 billion things and send back the response.
 
But it's even more than that too. Other AV looks at the file and gets 30 datas. One of those datas is the hash signature of the whole file, which indicates whether the file has changed at all. So the whole file needs to be read to get that data. And every time the file is accessed by something, the AV gets those datas again and compares them against 5 million things each time.
 
Webroot reads the whole file to get those 30 datas also.  But it always gets the full hash first. If it sees that the hash is the same as what it got before, it knows that the rest of the data is the same, so it doesn't waste time collecting that data again. That's a good thing.
 
"But what about having to read the whole file to get the hash? That takes a long time!"
Yes, yes it does. But... When Webroot is running, it knows everything that touches the disk to write. So if the file has not been touched by anything, Webroot doesn't even need to get that hash.
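 
A minimal sketch of that shortcut, under my own simplifying assumptions (compute_hash and the write hook are placeholders, not Webroot's code):
 
```cpp
// Sketch: skip re-hashing files that haven't been written since the last scan.
// compute_hash() is a stub; the write notification would really come from a
// kernel-level filesystem filter.
#include <string>
#include <unordered_map>

struct CacheEntry { std::string hash; bool dirty = true; };
static std::unordered_map<std::string, CacheEntry> g_cache;

// Placeholder: in reality this reads and hashes the whole file.
std::string compute_hash(const std::string& path) { return "stub-hash-of-" + path; }

// Called from the (hypothetical) write hook whenever something touches the file.
void on_file_written(const std::string& path) { g_cache[path].dirty = true; }

std::string hash_for_scan(const std::string& path) {
    CacheEntry& e = g_cache[path];
    if (e.dirty) {                  // file changed (or never seen): re-read it once
        e.hash = compute_hash(path);
        e.dirty = false;
    }
    return e.hash;                  // unchanged files cost a map lookup, not a full read
}
```
 
The point is that the expensive part (reading the whole file) only happens when the write hook says it has to.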
 
And even more importantly: Other AVs are statically prophylactic only. They -must- catch the threat before it starts doing anything at all in machine code otherwise they cannot prevent the infection. That means that before something can run or be accessed, it has to be fully-checked. Thus the access is put on pause while it is checked.
 
By comparison, Webroot has a much larger buffer for what it can do to protect. Even if a threat has started running, everything it does is watched, recorded, and thus revertable. Critical access to sensitive places is blocked until things are determined to be safe, but most programs don't ever try to access those sensitive places. So from the program's point of view, it's not stopped or inhibited at all. If the thing turns out to be a threat, it's killed and everything it did is undone.  Even days later, but usually within seconds or minutes.
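 
To illustrate the journalling and rollback idea, here is a toy sketch; the Action type and helper names are my own simplification, not the real implementation:
 
```cpp
// Toy sketch of journalled actions that can be undone in reverse order once a
// monitored process is judged bad.
#include <functional>
#include <vector>

struct Action {
    std::function<void()> undo;   // how to reverse this recorded change
};

static std::vector<Action> g_journal;

void record(std::function<void()> undo) {     // called as the unknown process acts
    g_journal.push_back({std::move(undo)});
}

void roll_back_everything() {                 // called when the verdict comes back "bad"
    for (auto it = g_journal.rbegin(); it != g_journal.rend(); ++it)
        it->undo();                           // reverse each action, newest first
    g_journal.clear();
}
```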
 
So:
 - Scan only the important things that need to be scanned to protect you fully.
 - Remember information for quick access so things don't need to be repeated over and over again repeatedly over and over again.
 - Don't do all the work of checking things on your computer. Your computer has better things to do with its CPU and disk. Check the things in the cloud.
 - Don't put things on hold unless they are critical deep access that can allow evading Webroot. Most programs don't go that deep so don't get put on hold.
 - Have a developer that prefers to write ten lines of highly-efficient code to make something happen instead of 200 lines of junk that works, but is icky. (Heck, even I did stuff like that. Our old WLogs program was about 5MB or so and had a whole bunch of weird ways to try to do things, and did them slowly, about twenty minutes or more to run. When I wrote WSALogs, I focused strongly on doing it all more efficiently and quickly and doing it all in a smaller package. Under a megabyte and usually runs in three minutes or less.)
 
WSA is a lot of looking at the things that make other AV programs heavy and finding solutions that make WSA light.
Userlevel 7
@ wrote:
My Norton subscription is running out, so I'm checking back in here to see if Webroot has come to their senses. Alas, still being stubborn it seems, even though a business user submitted a long list of good reasons to allow folder exclusion:
https://community.webroot.com/t5/Ask-the-Experts/Re-Exclude-folders-from-scanning/td-p/7666
 
Looks like Norton can count me in for another year.
I have a list of reasons to not include folder exclusions:
A work search of client systems with a work line item of "Threat Cleanup - AV Missed - User Error - Exclusion"
 
But no problem, because hey, Norton doesn't want you to get infected, (Norton Virus and Spyware Removal Service, $99) right? So they must know best.  Webroot must be crazy to protect you like that. ("Webroot offers free infection remediation with a current subscription.")
 
Edit:
By the way, the post you link to has no good reasons at all. "It removes all these tools!" he said. "No it doesn't," I pointed out. (In fact, it still doesn't. I have dozens of tech utilities, Nirsoft, password recovery, etc., and Webroot has never once flagged them.) And as I pointed out a year ago, the business product is centrally managed, so excluding the utilities once centrally will exclude them on all endpoints.
 
At this point, I really don't foresee Webroot changing their stance on this and reducing the protection of millions of people just to satisfy a few for no actually-good reasons. Give a legitimate case where folder exclusion is the best and only solution for a majority of customers and it might be revised.  Until then, enjoy Norton's computer resource usage and $99 virus removal fees if they miss something.
Userlevel 7
There is a distinct difference, though, between "Disable Protection", which makes everything "Red! Warning! Ohnoes! Do not do! Bad user!" as a constant, and excluding a folder, which creates no warning or advisory at all in anything (and certainly, if it did produce an ongoing "Ohnoes!", people would get annoyed). So no, you can't really legitimately compare "Sorry, we won't let you exclude folders" with "Sorry, we won't let you turn us off or uninstall us".
 
One might then counter with "Well, then by your logic power drivers should have the right to go 150 MPH on the freeway if they want to and it's bad for the police to not give us the choice to drive on the sidewalks if nobody is walking on them at the time."
 
Many AV products ignore the printer spool by default, and many locations that use print management software specifically tell the AV to ignore the printer spool.  Would you believe how many infections Webroot catches in the printer spool directory? It's a pretty severe number, because they know that AV is often designed or told to ignore that directory.
 
In all honesty, it really does come down to a balance. But in this case, the balance does go against allowing directory exclusion. Fewer people are negatively impacted by the lack of exclusions than are positively impacted by it, and though the only thing you can do is "vote with your wallet" by not buying the software, the cost you impose on the company (the loss of your purchase) is much smaller than the cost of allowing directory exclusions.
Userlevel 7
Even a year later, I'll still peek back to point out the following:
Data files are never scanned, not even realtime. Excluding data file folders would achieve nothing and would allow a virus a great place to hide, but it would not resolve the issues you are encountering. Also note that excluding the spooler by MD5 in Webroot is also Generally Useless™, since it's not monitored to begin with anyway.
 
I will describe the details if requested, however suffice to say that with three states (Bad, Good, and Unknown), and Monitoring, what you will want to be concerned with in the case of enterprise applications is situations where the applications in question are being monitored or are listed as unknown. If the applications undergo extremely frequent updates (Weekly, daily, etc), this can be annoying, since each new version is unknown at first, however the TR team can remedy that.
 
For normal programs that are not updated often, a quick note to Threat Research with data from an affected computer will remedy it in short order. That's also only necessary if manually adding the items from the console is ineffective.
 
In summary, the cause of crashes is going to be a program, or a DLL it loads, being monitored due to being listed as an Unknown. Technically this is caused, in all cases I have ever encountered, by what can only be called faulty programming in the application that crashes. However, that's only partially descriptive, since most programmers can't be expected to catch unlikely cases like race conditions created in tight timing situations that normally would be unaffected by anything external. There are also intentional cases of self-protecting code that objects to DLL injection as a whole.
 
Data file folders and the data files themselves don't need to be excluded at all though, since they were never scanned to begin with. Known [g]ood files (like the print spooler) also don't need to be overridden or excluded in Webroot either, since there is zero impact on them.
Userlevel 1
As the OP, I would like to note that it took Webroot 5 years to reach this conclusion.
Userlevel 7
Hey, Dan, how did this get anything to lock? Last time I watched Webroot with ProcMon, it never did any exclusive opens or locks until it found something it explicitly didn't like (Full Positive). Did the code base change that dramatically?
Userlevel 3
Hi ljs199!
 
Welcome to the Webroot Community!
 
It is not possible to exclude specific folders from being scanned, but it is possible to run a custom scan.
 
To run a custom scan, open Webroot and select PC security from the menu on the left.
Under the "scan" tab, select the "cutom scan" link.
This will bring you to a screen where you can choose a quick, full, deep, or custom scan.
You can choose "custom" and it will only scan the files that you select.
 
I hope this information helps!
 
Thanks! 🙂
