Thursday 13 October 2016

There must be 50 ways ...

Over on the ISO27k Forum today, a discussion on terminology such as 'virus', 'malware', 'antivirus', 'advanced threat prevention' and 'cyber' took an unexpected turn into the realm of security control failures.

Inspired by a tangential comment from Anton Aylward, I've been thinking about the variety of ways that controls can fail:
  1. To detect, prevent, respond to and/or mitigate incidents, attacks or indeed failures elsewhere (a very broad overarching category!);
  2. To address the identified risks at all, or adequately (antimalware is generally failing us);
  3. To be considered, or at least taken seriously (a very common failing I'm sure - e.g. physical and procedural control options are often neglected, disregarded or denigrated by the IT 'cybersecurity' techno crowd);
  4. To do their thing cost-effectively, without unduly affecting achievement of the organization's many other objectives ("Please change your password again, only this time choose a unique, memorable, 32 character non-word with an upside-down purple pictogram in position 22 and something fishy towards the end, while placing your right thumb on the blood sampling pricker for your DNA fingerprint to be revalidated");
  5. To comply with formally stated requirements and obligations, and/or with implied or assumed requirements and expectations (e.g. 'privacy' is more than the seven principles);
  6. Prior to service (flawed in design or development), in service (while being used, maintained, updated, managed and changed, even while being tested) or when retired from service (e.g. if they are so poorly designed, so tricky to use/manage or so inadequately documented that they are deprecated, even though a little extra attention and investment might have made all the difference, especially if they are not replaced by something better);
  7. As a result of direct, malicious action against the controls themselves (e.g. DDoS attacks intended to overwhelm network defenses and distract the analysts, enabling other attacks to slip past, and many kinds of fraud);
  8. When deliberately or accidentally taken out of service for some more or less legitimate reason;
  9. When forgotten, when inconvenient, or when nobody's watching (!);
  10. As an accidental, unintentional and often unrecognized side-effect of other things (e.g. changes elsewhere that negate something vital or bypass/undermine the controls);
  11. Due to natural causes (bad weather, bad air, bad hair - many of us have bad hair days!);
  12. At the worst possible moment, or not;
  13. Due to accidents (inherently weak or fragile controls are more likely to break/fall apart or be broken);
  14. To respond adequately to material changes in the threats, vulnerabilities and/or business impacts that have occurred since the risks were analyzed and the controls designed (e.g. new modes or tools of attack; different, more determined and competent attackers; previously unrecognized bugs and flaws; better control options ...);
  15. Due to human error: mistakes, carelessness, ignorance, misguided actions or efforts, bad guidance/advice etc. (another broad category);
  16. Gradually (obsolescence, 'wearing out', performance/capacity degradation);
  17. Individually or as a set or sequence (like dominoes);
  18. Due to being neglected, ignored and generally unloved (they wither away like aging popstars);
  19. Suddenly and/or unexpectedly (independent characteristics!);
  20. By design or intent (e.g. fundamentally flawed crypto 'approved' by government agencies for non-government and foreign use);
  21. Hard or soft, open or closed, secure or insecure, private or public;
  22. Partially or completely;
  23. Temporarily or permanently (just the once, sporadically, randomly, intermittently, occasionally, repeatedly, frequently, 'all the time' or forever);
  24. Obviously, sometimes catastrophically or spectacularly so when major incidents occur ... but sometimes ...
  25. Silently, without the failure even being noticed, at least not immediately.

That's clearly quite a diverse list and, despite its length, I'm afraid it's not complete! 

The last bullet - silent or unrecognized control failures - I find particularly fascinating. It seems to me that critical information risks are usually mitigated with critical information security controls, hence any failure of those controls (in any of the ways from that   l o n g   list above) is also critical.  Generally speaking, we put extra effort into understanding such risks, designing/selecting what we believe to be strong controls, and implementing and testing them carefully and thoroughly ... but strangely we often appear to lose interest at the point of implementation, when something else shiny catches our beady eye. The operational monitoring of critical controls is quite often weak to nonexistent (perhaps the occasional control test). 
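To make that a little more concrete, here is a minimal sketch (in Python, purely for illustration) of what ongoing operational monitoring of a critical control might look like: a simple heartbeat check that raises the alarm if the control has gone quiet. The file path, threshold and alerting approach are hypothetical assumptions on my part, not recommendations.

    # Minimal sketch: detect silent failure of a critical control via a heartbeat check.
    # Assumes the control (say, an antimalware service) updates a status file whenever
    # it runs; the path, threshold and alerting mechanism are hypothetical.

    import os
    import time

    HEARTBEAT_FILE = "/var/log/antimalware/last_update"   # hypothetical path
    MAX_SILENCE_SECONDS = 6 * 60 * 60                      # alert if silent > 6 hours

    def check_heartbeat(path=HEARTBEAT_FILE, max_silence=MAX_SILENCE_SECONDS):
        """Return (ok, message) describing whether the control appears alive."""
        if not os.path.exists(path):
            return False, f"Heartbeat file {path} is missing - the control may never have run"
        age = time.time() - os.path.getmtime(path)
        if age > max_silence:
            return False, f"Control silent for {age/3600:.1f} hours - possible silent failure"
        return True, "Control heartbeat is current"

    if __name__ == "__main__":
        ok, message = check_heartbeat()
        print(("OK: " if ok else "ALERT: ") + message)
        # In practice the alert would feed the incident response and security
        # management processes, not just print to a console.

The point is not the code itself but the principle: the check for the control failing is designed, implemented and monitored alongside the control.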

I would argue, for instance, that some security metrics qualify as critical controls, controls that can fail just like any other. How often do we bother to evaluate and rank our metrics according to criticality, let alone explicitly design them for resilience to reduce the failure risks?
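Again purely to illustrate the point, here is a tiny hypothetical sketch of ranking metrics by criticality and flagging those that have gone stale, a stale critical metric being, in effect, a silently failed control. The metric names, scores, dates and refresh intervals are all made up.

    # Illustrative sketch: rank security metrics by criticality and flag stale ones.
    # Metric names, criticality scores, dates and freshness thresholds are hypothetical.

    from datetime import datetime, timedelta

    metrics = [
        # (name, criticality 1-5, last refreshed, expected refresh interval)
        ("Patch latency (days)",     5, datetime(2016, 10, 12), timedelta(days=7)),
        ("Antimalware coverage (%)", 4, datetime(2016, 9, 1),   timedelta(days=7)),
        ("Awareness quiz pass rate", 2, datetime(2016, 8, 15),  timedelta(days=90)),
    ]

    now = datetime(2016, 10, 13)

    for name, criticality, refreshed, interval in sorted(metrics, key=lambda m: -m[1]):
        stale = (now - refreshed) > interval
        status = "STALE - the metric itself has failed" if stale else "fresh"
        print(f"[criticality {criticality}] {name}: {status}")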

I appreciate I'm generalizing here: some critical controls and metrics are intensely monitored. It's the more prevalent fire-and-forget kind that worries me, especially where nobody had the foresight to design in failure checks, warning signs and the corresponding response procedures, whether as part of the critical controls themselves or more generally as part of the organization's security management and contingency arrangements.

Good luck finding any of this in ISO27k, by the way, or indeed in other information security standards and advisories. There are a few vague hints here and there, a few hooks that could perhaps be interpreted along these lines if the reader were so inclined, but hardly anything concrete or explicit ... which itself qualifies as a control failure, I reckon!  It's a blind spot.


PS  There's a germ of an idea here for a journal article, perhaps even a suggestion to SC 27 for inclusion in the ISO27k standards, although the structure clearly needs attention. More thought required. Comments very welcome. Can we maybe think up another 25 bullets in homage to Paul Simon's "50 Ways to Leave Your Lover"?
