Posts

Showing posts from November, 2012

SMotW #34: homogeneity

Security Metric of the Week #34: organizational and technical homogeneity

The degree of homogeneity (sameness) or heterogeneity (variation or variability) within the organization and its technologies affects its aggregated information security risks, in much the same way that monoculture and multiculture crops face differing risks from natural predators, parasites, adverse environmental conditions etc.  A particular mold that successfully attacks a certain cultivar of wheat, for example, may decimate a field planted exclusively with that cultivar, whereas it may fail to take hold in a neighboring field planted with a mix of wheat cultivars differing in their susceptibility or resistance to the mold, making little impact.  On the other hand, under ideal conditions the monoculture crop may do exceptionally well (perhaps well enough to counteract the effects of the mold) where the mixed crop performs only averagely.  Homogeneity of technologies, suppliers, contracts etc. ...
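The monoculture-versus-mixed-crop argument can be sketched numerically. The following toy Monte Carlo simulation (entirely invented figures, purely illustrative) models 100 hosts running either one common platform or ten dissimilar ones, where each platform is exploited with the same independent probability. The average loss comes out about the same, but the monoculture's losses are far more variable - all-or-nothing, like the wheat field:

```python
import random

random.seed(42)

def simulate_losses(n_hosts, n_platforms, p_exploit, trials=2000):
    """Per-trial loss counts: each platform is independently exploited
    with probability p_exploit, and every host on an exploited platform
    is lost.  Hosts are split evenly across the platforms."""
    per_platform = n_hosts // n_platforms
    losses = []
    for _ in range(trials):
        lost = sum(per_platform
                   for _ in range(n_platforms)
                   if random.random() < p_exploit)
        losses.append(lost)
    return losses

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

mono = simulate_losses(100, 1, 0.1)    # monoculture: one platform everywhere
mixed = simulate_losses(100, 10, 0.1)  # ten dissimilar platforms

print(f"monoculture: mean loss {mean(mono):.1f}, variance {variance(mono):.0f}")
print(f"mixed:       mean loss {mean(mixed):.1f}, variance {variance(mixed):.0f}")
```

Similar expected losses, very different spreads: that spread is the aggregated-risk effect of homogeneity.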

Newton's take on security metrics

He may not have considered this at the time, but Sir Isaac Newton's three laws of motion are applicable to security metrics ...  Law 1.  Every object in a state of uniform motion tends to remain in that state of motion unless an external force is applied to it.  An organization lacking effective metrics has no real impetus to change its approach to information security.  Management doesn't know how secure or insecure it is, nor whether security is "sufficient", and has no rational basis for allocating resources to security, nor for spending the budget on security activities that generate the most value.  Hence, they carry on doing pretty much what they've always done.  They approve the security budget on the basis of "last year's figure, plus or minus a bit".  They do security compliance activities under sufferance, and at the last possible moment.   The law of inertia is particularly obvious in the case of large bodies that continue to blunder th...

SMotW #33: thud factor

Security Metric of the Week #33: thud factor, policy verbosity index, waffle-o-meter

If you printed out all your security policies, standards, procedures and guidelines, piled them up in a heap on the table and gently nudged it off the edge, how much of a thud would it make?  'Thud factor' is decidedly tongue-in-cheek but there is a point to it.  The premise for this metric is that an organization can have too much security policy material as well as too little.  Excessively lengthy, verbose, confusing and/or overlapping policies are less likely to be read, understood and complied with, while compliance and enforcement would also be of concern for excessively succinct, narrow and ambiguous policies.  A scientist might literally measure the thud using an audio sound level meter, dropping the materials (stacked/arranged in a standard way) from a standard height (such as one metre) onto a standard surface (such as the concrete slab of the laboratory floor), getting a...
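Short of the sound level meter, a crude 'policy verbosity index' could be computed from word counts alone. A minimal sketch, assuming a hypothetical acceptable band of 500 to 5,000 words per document (the thresholds and the policy figures below are entirely made up for illustration):

```python
def verbosity_index(doc_words, lower=500, upper=5000):
    """Flag policy documents whose word counts fall outside a
    (hypothetical) acceptable band: too terse or too verbose.
    doc_words maps document name -> word count."""
    too_terse = sorted(d for d, w in doc_words.items() if w < lower)
    too_verbose = sorted(d for d, w in doc_words.items() if w > upper)
    return {"total_words": sum(doc_words.values()),
            "too_terse": too_terse,
            "too_verbose": too_verbose}

policies = {  # invented figures, purely illustrative
    "Acceptable use": 1200,
    "Access control": 8400,
    "Clear desk": 150,
    "Cryptography": 3100,
}
report = verbosity_index(policies)
print(report)
```

Here "Clear desk" would be flagged as suspiciously terse and "Access control" as waffle, prompting a closer look at both extremes rather than treating sheer bulk as the only problem.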

More on survey metrics

Further to our last item about using security surveys judiciously as a source of metrics, today we're taking a gentle poke at the Data Breach Investigations Report (DBIR) published annually by Verizon Business. The DBIR is different from most published security surveys.  Arguably, it is not a survey at all: it is based on the findings of information security incident investigations that Verizon has conducted.  To be more accurate and precise, the 2012 DBIR states (with our added emphasis): "The underlying methodology used by Verizon remains relatively unchanged from previous years.  All results are based on first-hand evidence collected during paid external forensic investigations conducted by Verizon from 2004 to 2011.  The USSS, NHTCU, AFP, IRISS, and PCeU differed in precisely how they collected data contributed for this report, but they shared the same basic approach.  All leveraged VERIS as the common denominator but used varying...

SMotW #32: asset management maturity

Security Metric of the Week #32: information asset management maturity

'Managing information assets' may not be the sexiest aspect of information security, but it's one of those relatively straightforward bread-and-butter activities that, if done well, can substantially improve the organization's overall security status.  The premise for this week's candidate security metric is that the management of information assets and the management of information security are related.  In mathematical terms, they are positively correlated.  A mature, comprehensive, well-thought-out and soundly implemented approach to the management of information assets is likely, we believe, to be associated with a high-quality, effective approach to security management.  Even if the correlation is not terribly strong, asset management at least provides a solid foundation for assessing and ultimately managing the organization's information risks.  Conversely, weak or poor information ...
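The claimed positive correlation could actually be measured if you scored both activities, say per business unit. A minimal sketch using the standard Pearson correlation coefficient on entirely invented maturity scores (0-100 scales, hypothetical business units):

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented maturity scores for six hypothetical business units:
asset_mgmt = [30, 45, 55, 60, 75, 90]     # asset management maturity
security_mgmt = [35, 40, 60, 65, 70, 85]  # security management maturity

r = pearson(asset_mgmt, security_mgmt)
print(f"correlation r = {r:.2f}")
```

A value of r near +1 would support the premise; a value near zero would suggest the two are managed independently, and the metric would be telling you something either way.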

Risky metrics from security surveys

Published information security surveys can be a useful source of metrics concerning threat levels and trends, although there are several different ones, each with different methods, sizes, scopes, purposes and designs.  Big, scientifically designed and professionally executed surveys are inevitably expensive, raising questions such as why anyone would fund them and publish the results, often for free.  What are their motives?  What's in it for them?  This is an important but seldom-considered issue because of the distinct possibility of bias.  Bias is kryptonite to metrics.  It works like a catalyst: a small amount can have a disproportionately large effect on the outcome.  Some published survey reports are quite obviously biased, being little more than marketing vehicles that selectively collect, analyze and portray information largely to promote particular "solutions" as must-haves.  They tend to be based on rather small and non-random sample...
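The catalyst effect of a non-random sample is easy to demonstrate. In this toy simulation (all figures invented), 10% of a hypothetical population of organizations suffered a breach, but breached organizations are merely somewhat more likely to respond to the survey - and the headline "breach rate" more than triples:

```python
import random

random.seed(1)

# Hypothetical population: 10% of 1,000 organizations were breached.
population = [1] * 100 + [0] * 900
random.shuffle(population)

def estimate(sample):
    """Estimated breach rate: fraction of respondents reporting a breach."""
    return sum(sample) / len(sample)

# A proper random sample of 100 is roughly unbiased.
random_sample = random.sample(population, 100)

# Self-selected respondents: breached organizations are five times more
# likely to answer the survey (50% vs 10% response rates - a modest bias
# in who responds, not in who was breached).
biased_sample = [x for x in population
                 if random.random() < (0.5 if x else 0.1)]

print(f"true rate:     {estimate(population):.2f}")
print(f"random sample: {estimate(random_sample):.2f}")
print(f"biased sample: {estimate(biased_sample):.2f}")
```

Notice that nobody lied and no data was fabricated: the distortion comes entirely from who ended up in the sample, which is exactly why a survey's sampling method deserves more scrutiny than its headline numbers.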

Apollo metrics

If I had read about this a few years ago, I would probably have dismissed it as American propaganda, designed to fool the Russians into underestimating their technical capabilities: "How each channel was generated, though, is almost shockingly primitive by today's standards. The computers downstairs in the RTCC were responsible for producing the actual data, which could be numbers, a series of plotted points, or a single projected moving point. The System/360 mainframes generated the requested data on a CRT screen using dedicated digital-to-television display generators; positioned over the CRT in turn was a video camera, watching the screen. For the oxygen status display example above, the mainframe would produce a series of numerical columns and print them on the CRT. The numbers were just that, though. No column headings, no labels, no descriptive text, no formatting, no cell outlines, no nothing—bare, unadorned columns of numbers. In order to make them more understandable,...

Keeping tabs on contractors, consultants and others

A newly updated report from the Insider Threat unit at CERT concerns the information security threats arising from Trusted Business Partners (TBPs). Like all CERT's stuff, the report is well worth reading, not least because it incorporates case study materials from actual incidents - not a huge number, I admit, but many more than I personally have investigated or analyzed.  [In How To Measure Anything, Douglas Hubbard makes the valid point that even relatively poor/limited/dubious information is valuable if it advances our understanding, for instance if we have little or no prior knowledge in that area.  I believe CERT is a reliable, trustworthy source, and their reports certainly advance my limited knowledge, no question.  Look past the limitations and consider their advice.  YMMV, but it rings true and makes good sense to me.]  As described in the report, TBPs include lone consultants/contractors/temps (often working on-site) plus larger external service and outsourcing companies a...