Posts

Showing posts from March, 2013

Our tenth anniversary module

The new “Taking chances” awareness module is about identifying, assessing and dealing with information security risks and opportunities.  Whereas information security and risk management professionals, as a breed, are generally risk-averse, the awareness materials this month acknowledge pragmatically that there are legitimate business reasons to accept some information security risks, to take chances deliberately: the trick is to know which ones to live with, and which to avoid, pass to someone else or mitigate. Animals deal with safety risks routinely at a subconscious level, avoiding extreme dangers instinctively and learning to avoid other risks through teaching, by observing their parents and peers, or by trial and error: the ability to learn, and so change our behavior, is a vital survival skill.  In a sense, organizations also have both instinctive and learned reactions to risks.  This month’s awareness module passes on decades of real-world experience with the manag...

Molds and parasites - new families of malware

The following paragraph remains unredacted in a heavily redacted NSA newsletter from 1996: "The most harmful computer virus will not be the one that stops your computer, but the one that randomly changes or corrupts your data over time." Malware that causes data corruption perhaps ought to be called a fungus or mold rather than a virus, but I guess "virus" remains the nondescript all-purpose term preferred by journalists and laypeople alike.  Anyway, I partially agree with the statement. Compared to incidents as crude and noisy as completely stopping the computer, more sophisticated and silent attacks (such as those behind APTs - Advanced Persistent Threats) are more dangerous and insidious because they can continue unabated for longer.  As with a parasite that exploits its symbiotic relationship with the host, a lengthy infection starts off with the host barely even recognizing that it has been victimized. Random data corruption is a concern, for sure, bu...

Metric of the Week #50: policy clarity

Information Security Metric of the Week #50: proportion of information security policies that are clear This week's worked example concerns a metric that looks, at first glance, to be far too subjective to be of any value ... but read on. Clarity is a critical requirement for information security policies, and indeed for other kinds of policies.  Policies that are garbled, convoluted, rambling and full of jargon are less likely to be read and understood by the people who are expected to comply with them.  As a corollary, well-written, succinct policies facilitate reading and understanding, while a 'motivational' style further encourages compliance. It's obvious, isn't it?  So how come we still occasionally see policies written in the worst kind of legalese, stuffed with obscure/archaic language in an embarrassingly amateurish attempt, presumably, to appear "official"? The suggestion for this metric involves regularly surveying employees' opinions regarding the cla...
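To make the survey idea concrete, here is a minimal sketch of how the ratings might be rolled up into the headline metric. It assumes each policy collects several 1-5 clarity ratings and counts a policy as "clear" when its mean rating reaches a threshold; the policy names, the scale and the 4.0 threshold are all illustrative assumptions, not part of the original suggestion.

```python
# Sketch: compute "proportion of policies that are clear" from survey ratings.
# Assumption: a policy is "clear" if its mean 1-5 rating is >= CLEAR_THRESHOLD.
from statistics import mean

# Hypothetical survey data: policy name -> list of employee clarity ratings
ratings_by_policy = {
    "Acceptable Use":       [5, 4, 4, 5],
    "Password Policy":      [3, 2, 4, 3],
    "Data Classification":  [4, 4, 5, 4],
}

CLEAR_THRESHOLD = 4.0  # illustrative cut-off, to be agreed with management

clear = [p for p, r in ratings_by_policy.items() if mean(r) >= CLEAR_THRESHOLD]
proportion = len(clear) / len(ratings_by_policy)
print(f"{proportion:.0%} of policies rated clear")  # -> 67% of policies rated clear
```

In practice the threshold (and whether to weight respondents, e.g. by how recently they read each policy) is exactly the sort of detail management would want to argue over.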

On cryptography

On Cryptography The focus on key length obscures the failures of cryptography Mar 21, 2013 | 07:39 AM | By Gary Hinson | Light Reading Should companies continue sinking yet more money into cryptography? It's a contentious topic, with respected experts on both sides of the debate. I personally believe that cryptography is generally a waste of time and that the money can be better spent elsewhere. Moreover, I believe that our industry's obsessive fascination with crypto serves to obscure greater failings in security design. In order to understand my argument, it's useful to look at cryptography's successes and failures. One area where crypto doesn't work very well is health. We are forever trying to secure health records using encryption.  We apply the very finest mathematical and statistical trickery known to Man to scramble them beyond comprehension.  But then medics go and decrypt them in order to use them, call...

Metric of the week #49: infosec risk score

Security Metric of the Week #49: information security risk score Most risk analysis/risk assessment (RA) frameworks, processes, systems, methods or packages generate numbers of some sort - scores, ratings or whatever - measuring the risks.  We're not going to delve into the pros and cons of various RA methods here, nor discuss the differences between quantitative and qualitative approaches.  We know some methods only go as far as categorizing risks into crude levels (e.g. low, medium, high, extreme), while others produce percentages or other values.  We assume that ACME has chosen and uses one or more RA methods to analyze its information security risks, and on the whole management finds the RA process useful.  The question is: are information security risk scores produced by RA valuable as an information security metric?

P   R   A   G   M   A   T   I   C   Score
72  60  55  70  71  40  60  60  50  60%

In the view of ACME's manage...
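For what it's worth, the 60% headline figure is consistent with a simple unweighted mean of the nine criterion ratings. A quick sketch, assuming that is indeed how the ratings are combined (the letters stand for the PRAGMATIC criteria; the combination rule here is an assumption):

```python
# Sketch: combine the nine PRAGMATIC criterion ratings into an overall score,
# assuming a simple unweighted arithmetic mean, rounded to the nearest percent.
ratings = [72, 60, 55, 70, 71, 40, 60, 60, 50]  # P, R, A, G, M, A, T, I, C

overall = round(sum(ratings) / len(ratings))
print(f"Overall PRAGMATIC score: {overall}%")  # -> Overall PRAGMATIC score: 60%
```

A weighted mean would be a natural refinement if management cares more about some criteria than others.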

Risk matrix

I developed this 3x3 matrix today as part of an information security awareness module about risk management.  The matrix plots the severity (or impact or consequences or costs) of various kinds of information security incident against the frequency (chance, probability or likelihood) of those same kinds of incident.  It is color-coded according to the level of risk. Clearly, the incidents shown are just a few illustrative examples.  Furthermore, we could probably argue all day about their positions on the matrix (more on that below). Some might claim that it doesn't even qualify as a metric since there are no actual numbers on the matrix.  "Cobblers!" I say.  It is information that would be relevant to decision making, and that's good enough for me, but if you feel so inclined, go ahead and pencil in some suitable numbers at the boundaries of those severity and frequency categories ... and be prepared to argue with management about those numbers as well! A...
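Such a matrix is essentially a lookup table from (frequency, severity) category pairs to a risk level. Here is a minimal sketch; the cell assignments below are illustrative placeholders, not taken from the actual diagram:

```python
# Sketch: a 3x3 risk matrix as a lookup from (frequency, severity) categories
# to a color-coded risk level. Cell values are illustrative only.
RISK_MATRIX = {
    ("low",    "low"):    "low",
    ("low",    "medium"): "low",
    ("low",    "high"):   "medium",
    ("medium", "low"):    "low",
    ("medium", "medium"): "medium",
    ("medium", "high"):   "high",
    ("high",   "low"):    "medium",
    ("high",   "medium"): "high",
    ("high",   "high"):   "high",
}

def risk_level(frequency: str, severity: str) -> str:
    """Return the risk level for an incident's frequency/severity categories."""
    return RISK_MATRIX[(frequency, severity)]

print(risk_level("high", "medium"))  # -> high
```

Penciling in numeric boundaries amounts to replacing the category strings with ranges; the lookup structure stays the same.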

Measuring things right

A random comment of unknown origin fluttered into my consciousness today: “Made with the highest attention to the wrong detail”. It set me thinking about the waste of effort that goes into oh-so-carefully measuring and reporting the wrong things, for reasons that include:
- Failing to determine the information actually required, and/or mistakenly assuming the nature of the inquiries (and, perhaps, reporting to the wrong audience)
- Naivete and lack of understanding about metrics, measurement, decision making and/or statistics in general
- Using certain measures simply because the base numbers and/or the charts, tables and reports are readily available (so they must be useful, right?)
- Presuming the need for a high level of accuracy and precision when in fact rough-and-ready indicators would be perfectly adequate (and cheaper, and quicker)
- Analytical errors, e.g. believing that the measured item is well-correlated with or predictive of something of interest when in fact it isn't
- Brain-in...

Metric of the week #48: redundant controls

Security Metric of the Week #48: proportion of information security controls that are ossified Information security risks and controls change gradually within the organization.  From time to time, new risks emerge, new controls are introduced, existing controls find new applications, old risks subside and redundant controls are retired from service - at least in theory they are.  Keeping the portfolio of controls aligned with the risks is the primary purpose of the information security function and the Information Security Management System.  This week's metric is one way to measure how well that alignment is being managed in practice.  In immature organizations, security controls, once accepted and installed, seem to become permanent fixtures, integral parts of the corporate infrastructure.  Successive risk analyses lead to the introduction of yet more controls, but nobody ever quite gets around to reviewing and thinning out the existing corpus of controls. ...