Anti-virus is required by all our regulatory standards, as well as by good sense. But the mechanism we use to achieve it is so obsolete that it is frankly weird we don't notice.
If we actually read the regulatory standards (PCI, HIPAA, FISMA, etc.), they say we need to use anti-virus technology and keep it up to date. I have not found a single place in the body of regulations that specifies what technology we must use to accomplish that. As an IT security community, we owe it to our integrity to actually read the compliance standards we are adhering to.
Back in the dark ages, at a place called Dr. Solomon's Software (I loved that job), we had really good signature AV that we updated once a month. It took care of the Trojans and Word macros we saw back then, and the update fit on a floppy (remember floppies?). The total resource utilization was around 30 KB of memory, which even on our Win 95 machines was a trivial amount.
At the time of this writing, my virus scanner takes 68 MB of RAM, pulls between 100 and 500 KB of updates a day, and a full update is more than 100 MB. That is after a major engineering effort just lowered it to 60 MB late last year. The DAT files are massive and only getting bigger, and updating them is a cumbersome undertaking. The latest iteration has the AV client communicating over UDP on a near-continual basis. And we think this is a good idea why? Regardless of the vendor we choose, AV management has become a full-time job.
Signature AV also has dozens of black eyes on its record. Everybody I know in IT (with a single exception) has outbreak stories. Everybody has had their signature AV fail to protect them, even after massive effort.
So signature AV is like having a club bouncer at a chic party who only stops people he recognizes as bad guys, or who look a lot like bad guys (heuristics). Wouldn't it be better to just have a guest list, and admit only people you know and people with an invitation?
The most surreal part is that all the alternative technologies (buffer-overflow protection, whitelisting, firewalls, VDI, network access control) use fewer resources, demand less maintenance time and management framework, and can be operated a lot more cheaply, and yet signature AV is still alive.
These days I am leaning toward dynamic application whitelisting, with products like Solidcore, CoreTrace, or Savant. Everything is either already trusted as known, or is introduced by a hash (SHA-1 or MD5) or a certificate. All other apps (worms, Trojans, viruses, etc.) are stuck at the door. This way users can still install iTunes (if we let them), but we don't have to worry about them "installing" the video of Osama bin Laden's execution.
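The guest-list mechanics are simple enough to sketch. The following is a minimal illustration of a hash-based allow check, not any vendor's actual implementation; the `allowlist` set stands in for what would, in a real product, be a managed database of trusted hashes and certificates:

```python
import hashlib

def file_sha1(path):
    """Compute the SHA-1 digest of a file, reading in chunks to handle large binaries."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def allowed_to_run(path, allowlist):
    """The whitelist model: an executable runs only if its hash is on the
    guest list. Anything not on the list, known-bad or not, is stuck at the door."""
    return file_sha1(path) in allowlist
```

Note the inversion from signature AV: there is no database of bad guys to keep current, only a (much smaller and more stable) list of what we trust.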
Another good idea is to have employees work on a VDI, with all the data stored on the SAN on the far side of a good network IPS. The VDIs are destroyed at the end of each use, so there is literally no meaningful way for users to infect the data or compromise the work asset (the VM). On the VDI host we just put our application whitelisting and protect against keyloggers and screen scrapers.
Buffer-overflow protection is good, but it is a percentage win; most threats these days use a social-engineering vector. Host firewalls are something of a throwback technology; they are also a percentage win, but do nothing about threats that arrive via user spoofing or social attack. Whitelisting web pages has matured as well. Browser plugins (things like SiteAdvisor) that block users from reaching bad sites are a nice incremental win against social engineering.
In an innovation industry like ours, why exactly can we not kill off the 1992 technology with an armload of better options?