It appears I am confused about the difference between a 'deep' scan and a 'full' scan. A comment by JimM in another thread is shown below. The online help is ambiguous.
I was under the impression that if I chose a 'deep' scan and pointed WRSA at a selected directory tree, it would grind through every file and archive, unpacking archives of archives all the way down to the last leaf. But it appears that I need to select a 'full' scan in order to accomplish a thorough scan.
-- Roy Zider
Quote by JimM in another thread:
That is because a deep scan only scans potential threats. There are some files on your computer that can be ruled out as threats and are therefore left out of the scan by default. You could run a full scan if you want to scan every last file, but that isn't advised since you are already protected by running the intelligent deep scan and via protection through the shields, which operate in real-time.
Best answer by Kit
Thank you for your continued interest in resolving your technical issues using this forum.
There is a section of our user guides that is devoted to scanning.
This is a copy/paste of the relevant section:
Webroot SecureAnywhere allows you to select several types of scans:
• Quick. A surface scan of files loaded in memory. This scan runs quickly, but may miss
some types of inactive malware that launch after a system reboot. Note: If the Quick scan
misses an infection, the main interface remains red until you run a Full or Deep scan.
• Full. A scan of all hard drives. This type of scan is helpful if you frequently switch
between system partitions or have several programs that have never been scanned.
• Deep. An analytical scan that searches for all types of threats, including rootkits and
inactive malware. This is the default scan that runs from the main panel or system tray.
• Custom. A customized scan of files and folders.
A link to the full User Guide page on the Webroot Support site can be found here.
--- Roy Zider
This is the normal scan mode. The agent inspects system configuration information (Registry and file locations, running processes, loaded modules, etc.) to determine what is loaded into memory, and what definitely will or is likely to load into memory during normal computer use. These files are then initially scanned by generating an MD5 hash of the full file and submitting it to the cloud system.
If the item is known Good, this data is cached and no further action is taken.
If the file is returned as known Bad, it is inspected more deeply, interdicted if currently in memory, and the cleanup engine is brought into play to start keeping track as the scan continues.
If the file is reported as Unknown (which will also occur in some cases with more-rare files that are marked good in the cloud database), it has further information gathered about it and submitted to the cloud. At points further in the scan or upon request from the cloud in response to information requests on Unknown files, the file may also be pseudo-executed in protected memory space for deeper examination. This extra information, including from the pseudo-execution, may result in a Good or Bad determination from the cloud based on cloud heuristics, or a heuristic determination from the agent itself, in which case one of the above Good or Bad results occur.
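The hash-then-dispatch flow described above can be sketched roughly as follows. This is an illustrative sketch only, not Webroot's actual implementation: the `KNOWN_GOOD`/`KNOWN_BAD` sets and the `_cache` dict are hypothetical stand-ins for the cloud database and the agent's local cache.

```python
import hashlib

# Hypothetical stand-ins for the cloud database (not Webroot's real data).
KNOWN_GOOD = {"d41d8cd98f00b204e9800998ecf8427e"}  # e.g. MD5 of an empty file
KNOWN_BAD = set()
_cache = {}  # local cache of prior verdicts

def md5_of_file(path):
    """Hash the full file contents, as the agent does before a cloud lookup."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def classify(path):
    """Return "Good", "Bad", or "Unknown" for a file, caching the verdict."""
    digest = md5_of_file(path)
    if digest not in _cache:
        if digest in KNOWN_GOOD:
            _cache[digest] = "Good"      # cached; no further action
        elif digest in KNOWN_BAD:
            _cache[digest] = "Bad"       # would trigger deeper inspection/cleanup
        else:
            _cache[digest] = "Unknown"   # would trigger extra data gathering
    return _cache[digest]
```

The real agent, of course, gathers more data on Unknowns and may pseudo-execute them, as described above; the sketch only shows the initial triage.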
A deep scan specifically targets only things that are running, and that definitely or probably will run. The remainder of the system is considered inert, since the contents are not active or poised to be active. This remainder is evaluated if and when it becomes active or primes to become active, via on-access scans, process interdiction, etc. This portion is handled by realtime shields.
By cutting out the evaluation of inert data, the scan times in the logs I have reviewed from you are about 3-4 minutes for deep scans, and the system is still fully protected. If a threat was "missed" because it was not part of the targeted area, it isn't going to run anyway; it does nothing while just sitting as bits on media. In the event the threat is read or attempts to execute machine code, it is scanned and caught at that moment.
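The "caught when it is read or tries to run" idea amounts to an on-access hook. Here is a minimal sketch of the concept, with a hypothetical `scan` function and `BLOCKLIST` standing in for the realtime shields' cloud-backed checks:

```python
# Hypothetical known-bad paths; a stand-in for the shields' real verdict source.
BLOCKLIST = set()

def scan(path):
    """Trivial stand-in for a realtime scan of one file."""
    return "Bad" if path in BLOCKLIST else "Good"

def guarded_open(path, mode="rb"):
    """Scan a file at the moment of access; block the open if the verdict is Bad."""
    if scan(path) == "Bad":
        raise PermissionError(f"access blocked: {path}")
    return open(path, mode)
```

A real shield hooks file and process activity at the OS level rather than wrapping `open`, but the principle is the same: inert bits cost nothing to leave alone, because they are evaluated at the instant they become active.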
This is an inventory of the files against the database. Each file is hashed in full into an MD5. Archives are extracted and their contents are hashed as well; archive extraction follows a limited recursion policy. The MD5 hashes are submitted to the cloud database and returned as "Good", "Bad", or "Unknown". Unlike a Deep scan, Unknown cases are not inspected more deeply. In fact, in most cases, the information on the status of files outside the Deep area expires from the local cache before the files are ever examined again in normal computer use.

If infections are detected outside the Deep zone on a custom or full scan, cleanup is performed on a basic level (deletion/quarantine) rather than based on journalling and activity evaluation. However, a secondary scan will also run after a custom or full scan, and take deep-scan-style extra evaluation action on unknowns if this is decided to be warranted by the agent or the cloud. (In retrospect, this is potentially the cause of the continued CPU usage after the scan in the other message, if the agent was handling cleanup and a secondary scan in the background.)
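The archive-inventory behavior described above (hash the archive, extract it, hash the members, stop at a recursion limit) can be sketched like this. The depth limit of 2 is an assumption for illustration, since the actual recursion policy isn't published, and only zip archives are handled here:

```python
import hashlib
import io
import zipfile

MAX_DEPTH = 2  # assumed recursion limit; the real policy is not documented

def hash_bytes(data):
    """MD5 of a blob, mirroring the full-file hashing the scan performs."""
    return hashlib.md5(data).hexdigest()

def inventory(data, name, depth=0, results=None):
    """Hash a blob; if it is a zip archive, also hash members up to MAX_DEPTH."""
    if results is None:
        results = []
    results.append((name, hash_bytes(data)))
    if depth < MAX_DEPTH and zipfile.is_zipfile(io.BytesIO(data)):
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            for member in zf.namelist():
                inventory(zf.read(member), f"{name}/{member}", depth + 1, results)
    return results
```

Each `(name, hash)` pair would then be looked up against the cloud database, with no deeper follow-up on Unknowns, as the answer explains.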
A Custom or Full scan is recommended only for scanning resources that are accessed via routes bypassing the computer the agent is installed on, or accessed by systems that may not have an agent installed on them. For example, network shares on a server with no WSA agent that may be accessed by machines with no WSA agent, or a USB drive that will be moved to another computer with no WSA agent on it.
For normal consumer computer use, the deep scan is enormously faster, substantially more efficient, and much more effective at protecting that specific computer.
Does that answer the question better? :) Also, if I was unclear in parts, or if this generates any further questions, please let me know.
1) Open up the console
2) Click on the gear in the top right corner where it says Advanced Settings
3) On the Scheduler tab, make sure that "Perform a scheduled Quick Scan instead of a Deep Scan" box is UNCHECKED.
Jeff is correct but you can also get more granular control of scans by:
1. Click on the gear/cog symbol to the right of the PC Security tab, in the main console
2. In the new panel displayed (which should be entitled Scan & Shields), look in the bottom left-hand corner for a button marked 'Custom Scans'.
3. Click on that to access options that let you choose where scans run and what level of scan you require on an ad hoc basis.
Hope that helps?
Take a look at this article and see if that sheds some more light on this topic.
A Rootkit is defined as "a set of software tools that enable an unauthorized user to gain control of a computer system without being detected."
Yes, "other threats" will include virii (or viruses, if you prefer).
A deep scan examines running processes, performs heuristic and behavioral analysis, and pseudo-executes unknowns in a sandbox. It also checks for differences between raw, kernel, and user views to detect rootkits, and looks for currently-running processes that are doing something other than what their initial code implies they should be doing.
A full scan only computes the hash signature of each file and compares it to the online database. So a full scan is pretty much the equivalent of other AV programs, in the sense of comparing signatures.
Yes, it can.