Scan examines every cell in a spreadsheet

  • 28 October 2016
  • 2 replies
  • 28 views

Badge +3
Hello,
 
I decided to start a scan of all the files on our file server. As near as I can tell, the only way to do this is manually, since the data lives on drive letters other than "C:".
 
Anyway, we are talking about 2 TB of data on a fast Windows Server 2012 R2 VM. I knew it would take a while, but the scan of the first half (about 1 TB) has now been running for more than a WEEK.
 
Among other things, I see that it has been bogged down for many hours in a single folder containing some Excel documents. I can watch the filenames go by, and I eventually figured out that it is scanning EACH CELL of each spreadsheet, apparently because they contain references to external content, like ".bin" and ".xml" files (I have no idea where that content is actually located).
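For what it's worth, an .xlsx workbook is really a ZIP archive full of XML parts, and embedded objects show up as ".bin" entries inside it, so that is presumably what the scanner is walking through. Here is a rough Python sketch of what's inside one of these workbooks (the filename is a placeholder):

import zipfile

# An .xlsx file is a ZIP archive of XML parts; embedded OLE objects
# appear as .bin entries (e.g. xl/embeddings/oleObject1.bin) and
# external links as separate XML parts. "report.xlsx" is a placeholder.
with zipfile.ZipFile("report.xlsx") as wb:
    for info in wb.infolist():
        if info.filename.endswith((".bin", ".xml")):
            print(info.filename, info.file_size, "bytes")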
 
Is there a way to switch off this behaviour? Thanks.

2 replies

Userlevel 7
Badge +33
Welcome to the forums, Idigioia.
 
I'm trying to understand your reasoning for wanting to blanket-scan entire drives when it's really not necessary. That's the old way of doing things, and it's one of the main reasons other AV vendors' products are so slow.
 
Webroot can scan an entire drive's contents if you want, but it's simply not necessary; it will take a very long time and eat up resources that are best left for other functions of the OS.
 
You see, when Webroot is on a system, it scans the critical areas where it knows most malware will be located and ignores the remainder of the system. If and when a program, file, or process is executed from a non-standard location, Webroot will jump in, take a look, and remediate when needed.
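If it helps to picture that model, here's a purely illustrative Python sketch (none of this is Webroot's actual code; the directories and the check are made up): scan a short list of critical locations up front, and only examine everything else when it actually tries to run.

import os

# Illustrative sketch of "scan critical areas, defer the rest".
# CRITICAL_DIRS and scan_file() are invented for the example.
CRITICAL_DIRS = [r"C:\Windows\Temp", r"C:\Users\Public\Downloads"]

def scan_file(path):
    """Placeholder for a real signature/behaviour check."""
    return False  # pretend nothing matched

def startup_scan():
    # Up-front pass over the locations where malware usually lands.
    for root in CRITICAL_DIRS:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                if scan_file(full):
                    print("remediate", full)

def on_execute(path):
    # Anything outside the critical list is only looked at when it runs.
    if scan_file(path):
        print("block and remediate", path)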
 
If malware is found there, it will take action and follow up with a slightly more thorough scan.
 
You need to look at things differently. A malicious executable sitting in a folder that's never accessed DOES NOT mean your system is infected. It's only infected if the file is actually allowed to run and deliver its payload or perform its written tasks. That's when the Webroot agent will kick in and do its thing. Until then, it's just another file lying dormant.
 
Look at it this way: when you do the dishes in your house, you normally check the area around your kitchen and the places where you usually eat (by the TV, at the computer, the dining room table, etc.). But you never look under the sofa or the beds for dirty plates. If you happen to vacuum or change the bed sheets, you might stumble across a plate and then take action. Until then, it's just a plate.
 
So in the end, your endeavour is fruitless; it only gives you a false sense of satisfaction because you are probably used to the old way of doing things.
 
You can always set a task to scan specific folders or drives during off-peak hours, but it really isn't needed.
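If you do go that route, something along these lines would register a weekly off-peak task with Windows' built-in scheduler. Note that the "WRSA.exe -scan" command line below is an assumption on my part, not a documented switch; substitute whatever on-demand scan command your agent actually supports.

import subprocess

# Registers a weekly Sunday 2 AM task via Windows' built-in schtasks.
# The "-scan" switch on WRSA.exe is an assumption, not a documented flag.
scan_cmd = r'"C:\Program Files\Webroot\WRSA.exe" -scan'
subprocess.run(
    [
        "schtasks", "/Create",
        "/TN", "OffPeakFileServerScan",
        "/TR", scan_cmd,
        "/SC", "WEEKLY",
        "/D", "SUN",
        "/ST", "02:00",
    ],
    check=True,
)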
 
Hope this clears things up and explains the approach better.
 
John
 
 
Badge +3
You are missing the fact that this is a file server for a 200-user network. It is not "a malicious executable sitting in a folder that's never accessed."
 
These 2 TB of files have never been scanned by a good AV. Not only do I not want anything infected in there, I also want to know whose folder structure (if any) contained infected files.
 
Also, does this mean that when a user on a typical desktop computer (protected by Webroot, of course) opens one of these spreadsheets, they will see the same behavior (every cell scanned)? I tried it, and these particular documents certainly do take a long time to open.
