I don’t like to use antiviral tools to recover from malware infections. I use a different approach.
If I do have to scrub a boot drive for malware, rather than restore it from a backup, I reboot from an alternative boot drive, which makes the normally running drive much easier to scrub. I don’t want malware that has fused itself to the running copy of the OS to be able to fight back against what I am doing, so I switch to an uninfected copy of the OS to run the machine, usually by booting from another disk. With the malware unable to run (since it is literally on the wrong disk), it becomes much easier to disinfect an infected OS.
But scrubbing a drive for malware is a surgical procedure. I consider doing this to be a last resort, even when it is made as automatic as possible by way of antiviral (AV) tools. So I try to keep things set up in such a way that I don’t have to scrub. Recovery by restoring a working environment is simpler, faster and safer.
Pre-infection monitoring on machines subject to malware is still a good idea, but once I know one of my machines is infected, I follow a different script.
I was recently motivated to look into AV tools for Mac OS X. When I found them, I quickly discovered how bad they were. I think Mac users should not rely on available AV tools, and fortunately most don’t have to. Most AV tools are installed on Macs, apparently, to help keep Macs from being used to propagate malware to Windows systems.
I think of AV technologies less as a recovery mechanism and more as a limited form of detection and prevention. I have avast! running, but apart from the 1800 copies of assorted mouldy PC malware it discovered in my ten years’ worth of email attachments on its very first pass, it has not seen anything exciting, much less anything that targets Mac OS X.
It is genuinely interesting to get opinions from AV tools. I don’t think relying on malware detection is a sound idea, but using them to enhance your network privacy and security might be. I feel better having designed and implemented a set of Standard Operating Procedures that produce reliable results.
The key to this approach is to keep primary versions of all files somewhere else. I don’t care what it is, if you can’t lose it, do not keep a master on your desktop or laptop. When I use or edit a file, I work on a local copy, never on a master. Every file on the machine I most directly use is absolutely expendable. I am not afraid to nuke anything on that machine.
Here are the questions to ask if you want to be proactive about surviving work interruptions:
While you have your development environment up and working, do you make a clean copy of it? I do.
Is it possible to completely rebuild your development environment within an hour from scratch? I can.
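One way to make the one-hour rebuild real is to keep the environment as a script plus a manifest, rather than as accumulated state on a disk. A minimal sketch, with a hypothetical package list and a stubbed install command (in practice it would be `brew install`, `apt-get install`, or whatever your platform uses):

```shell
#!/bin/sh
# Rebuild-from-scratch sketch: the environment is defined by a manifest
# and a loop, not by whatever happens to be on the machine.
PKGS="git rsync vim"   # hypothetical manifest; yours will differ

install_pkg() {
    # Stub for illustration; replace with your real package manager,
    # e.g. `brew install "$1"` or `apt-get install -y "$1"`.
    echo "installing $1"
}

bootstrap() {
    for p in $PKGS; do
        install_pkg "$p"
    done
}
```

Keep the script itself with your masters, of course, not only on the machine it rebuilds.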
When you’re working in an environment subject to malware, do you isolate the environment? I do. Virtualization tools like VMware, VirtualBox, and Parallels make this easy.
Do you keep your work, in incremental fashion, in a repository? That is normal practice, and I do.
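“Incremental fashion” can be as simple as a checkpoint routine that commits everything at short intervals and pushes when a remote exists. A sketch assuming git and a hypothetical repository path; the commit-message format is just an example:

```shell
#!/bin/sh
# Checkpoint sketch: the repository, not the machine, holds the history,
# so a wiped machine costs at most a few minutes of work.
REPO="$HOME/work/project"   # hypothetical path

snapshot() {
    cd "$REPO" || return 1
    git add -A
    # Commit quietly; if nothing changed, the commit fails and we move on.
    git commit --quiet -m "checkpoint: $(date '+%Y-%m-%d %H:%M')" || true
    # Push if a remote is configured; ignore the failure if not.
    git push --quiet origin HEAD 2>/dev/null || true
}
```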
Do your email and other business natter normally live on a separate machine somewhere? Mine is backed up; I’m migrating to a solution where this material lives on a private server, which is also backed up.
Once you know there’s some sort of malware eating your machine, wouldn’t you rather recover in an hour or two, using well-known, well-defined recovery paths that always work? I think most people would volunteer to do things this way, once they understood what is involved.
Not having to mess with the mildly interesting but likely intricate details of malware extraction (scrubbing), a process that might take hours or days? Priceless.