ISO27k ISMS metrics

Information is clearly a valuable yet fragile corporate asset that must be protected against a wide range of threats. Protecting information is complicated by its ubiquity and its intangible, ephemeral and dynamic nature; on top of that, the information risks themselves are constantly changing. Furthermore, information risks have to be managed alongside all the other risks facing the business, of which there are many. Information risk management is a tough challenge, made harder still if management lacks sufficient, relevant and reliable information concerning the status of information risk management activities, processes, information security etc.
 
"What should we be measuring?" is a common refrain, along with "What are the most common security metrics?". At face value, these are perfectly reasonable and sensible questions. However the first is impossible to answer without knowing more about the organization's situation, while the second is trickier still: scientifically speaking, we simply don't know what security metrics are in use ... and even if we did it would not help much since organizations differ in their:
  • Industries and market segments (defense, retailing, banking and IT/business services, for instance);
  • Size and hence scale and complexity (some of the metrics suitable for a massive global group are going to be materially different to those for a mid- or small-sized organization: they are not just bigger i.e. more of the same);
  • Information risks, both in absolute terms, in relation to other risks of concern (strategic risks, finance risks, technology risks ...) and relative to management's risk appetite or risk tolerance level;
  • Information security arrangements, including requirements and constraints imposed upon them e.g. by laws, regulations and contracts (compare, say, a hospital driven by privacy and HIPAA, a bank subject to the banking regs, and a PCI-compliant online retail business);
  • Maturity in respect of information risk, security and metrics. 
In an opinion piece for CSO, Pete Lindstrom wrote:

We are going to work with metrics on:

  • IT assets (number of users, devices, servers, apps, etc.)
  • Usage activity (sessions, flows, messages, etc.)
  • Process controls (user account create/modify/delete; vuln detect/patch, incident detect/respond, etc.)
  • Real-time (inline) controls (antimalware, firewall, email security, etc.)
  • Incidents
Here is a good core set of board metrics that provide strategic insight into the enterprise cybersecurity program:
  • Cyber risk: the percentage of inappropriate usage activities out of all usage activities
  • Cybersecurity efficacy: percentage reduction in cyber risk provided by the real-time cybersecurity controls
  • Cyber exposure: average number of usage activities per IT asset
  • Cyber resilience: average number of real-time controls applied for each usage activity
  • Risk aversion ratio: the willingness to accept productivity impairment (e.g., password failures, false positives) compared to the malicious activity allowed or denied (true positives plus false negatives)

In addition, we need to factor in costs and value. After all, financial information is the lingua franca of the business world:

  • Loss to value ratio: spending on cybersecurity including incident losses compared to financial value provided by IT assets.
  • Control cost per IT asset (probably application): allocated costs of cybersecurity controls by IT asset
  • Risk reduced per unit cost: financial value of reduced risk compared to total cybersecurity spending
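
To make the first four of those ratios concrete, here is a minimal sketch in Python. The counts, field names and sample figures are our own illustrative assumptions, not taken from Lindstrom's piece:

```python
from dataclasses import dataclass

@dataclass
class ActivityCounts:
    total_usage: int     # all observed usage activities (sessions, flows, messages)
    inappropriate: int   # usage activities judged inappropriate
    blocked: int         # inappropriate activities stopped by real-time controls
    it_assets: int       # users, devices, servers, apps in scope
    control_checks: int  # real-time control checks applied across all usage

def board_metrics(c: ActivityCounts) -> dict:
    # Cyber risk: proportion of usage activities that are inappropriate
    cyber_risk = c.inappropriate / c.total_usage
    # Residual risk once the real-time controls have blocked what they can
    residual = (c.inappropriate - c.blocked) / c.total_usage
    return {
        "cyber_risk": cyber_risk,
        # Efficacy: percentage reduction in cyber risk from real-time controls
        "cybersecurity_efficacy": 1 - residual / cyber_risk if cyber_risk else 0.0,
        "cyber_exposure": c.total_usage / c.it_assets,         # activities per asset
        "cyber_resilience": c.control_checks / c.total_usage,  # controls per activity
    }

print(board_metrics(ActivityCounts(100_000, 400, 300, 250, 350_000)))
```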

Information risk management process metrics

Processes are normally measured using criteria such as their efficiency and effectiveness. The efficiency of risk management can be measured in part by the quantity and quality of resources (mostly people) involved in it, but how could we tell if the quantity or quality is too high or too low? Measuring effectiveness should tell us how well information risk management is working in practice, indicating whether we might need to adjust the resourcing up or down …

The effectiveness of information risk management can be measured using metrics such as:

  • Demonstrated performance and activity levels e.g. the number and variety of distinct information risks currently being managed (perhaps the count or proportion of risks in the corporate Risk Register which are information risks); the amount or depth of analysis and reporting for each of the managed risks; the amount of change in the Risk Register during the year etc.;
  • Process maturity metrics – see below for an example;
  • Purely subjective measures e.g. a survey of opinions on the usefulness or value of information risk management, perhaps contrasting the opinions of different groups such as managers and information risk professionals;
  • Benchmarking of the organization’s risk management activities against good practices in the field (perhaps using standards such as ISO 31000), preferably using experienced independent consultants or auditors to assess, measure and report, with recommendations for improvements; 
  • A “nasty surprises” metric: how often in the reporting period has management unexpectedly received bad news, such as information security incidents/breaches/compromises in areas where they thought the information risks were well under control? Offset that by the number of times management has received good news or no news from areas that appear to have been doing just fine, and weight both numbers according to the level of risk and the costs of the incidents involved (a simple scoring sketch follows this list).
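
As an illustration of how that metric might be scored, here is a minimal sketch assuming each item of news has been tagged with a weight reflecting the risk level and incident costs (the weighting scheme and examples are ours, purely for illustration):

```python
def surprise_score(bad_news, good_news):
    """Net 'nasty surprises' score for a reporting period.

    bad_news/good_news: lists of (description, weight) tuples, where
    the weight reflects the level of risk and the cost of the incident
    (e.g. 1 = trivial ... 5 = severe). Lower scores are better.
    """
    return sum(w for _, w in bad_news) - sum(w for _, w in good_news)

score = surprise_score(
    bad_news=[("breach in a 'well-controlled' system", 5),
              ("repeat audit finding", 2)],
    good_news=[("clean penetration test on payments platform", 3)],
)
print(f"Net surprise score this period: {score}")  # positive = worse than expected
```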

Information classification metrics

Complexity is an important characteristic of an information classification scheme. Dissimilar information assets are more likely to be lumped inappropriately together if there are too few classes or classification levels, while overly complicated schemes are more difficult and hence expensive to use properly, with the risk again that some information will be wrongly classified and perhaps improperly secured. A useful metric is therefore the total number of categories – normally the product of the numbers of levels under each classification criterion e.g. a scheme with four confidentiality classes, two data integrity classes and three availability classes has 4 x 2 x 3 = 24 distinct categories. This metric might be used to compare or benchmark different schemes: within reason, fewer is better.

The proportion of information assets in each category is another interesting metric. The highest level categories would normally be expected to hold fewer items than the lower ones in order to focus additional attention on the highest risks and ensure the appropriate security controls are applied. The simplest way of measuring this metric is by manually sampling and counting classified items periodically.
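
A minimal sketch of both metrics, assuming a simple three-criterion scheme and a manually counted sample of classified items (the scheme and class names are purely illustrative):

```python
from collections import Counter
from math import prod

# Levels defined under each classification criterion (illustrative scheme)
scheme = {
    "confidentiality": ["public", "internal", "confidential", "secret"],
    "integrity": ["standard", "critical"],
    "availability": ["routine", "important", "vital"],
}

# Total categories = product of the number of levels per criterion
total_categories = prod(len(levels) for levels in scheme.values())
print(f"Distinct categories: {total_categories}")  # 4 x 2 x 3 = 24

# Proportion of sampled items in each confidentiality class
sample = ["internal", "internal", "public", "confidential", "internal", "secret"]
counts = Counter(sample)
for level in scheme["confidentiality"]:
    print(f"{level:>12}: {counts[level] / len(sample):.0%}")
```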

Baseline security metrics

The breadth and quality of the organization’s baseline information security controls are typically assessed and measured through a gap analysis, review or audit against recognized good practices using information security standards such as NIST SP800-53 and ISO/IEC 27002. Gaps or weaknesses in the controls should be evaluated in relation to the importance of the corresponding information risks that are evidently not being addressed: if the risks are of little or no concern to the organization, the corresponding controls may be unnecessary. However, if the risks are relevant, the gaps almost certainly need to be closed and the weak controls improved.

While a simplistic compliance metric might simply report whether or not the recommended controls are in place, more sophisticated metrics might take account of factors such as the effectiveness and suitability of the controls, their changing status/maturity etc., and most of all the amount of residual risk in each area.
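
One way to move beyond a binary in-place/not-in-place compliance score is a risk-weighted coverage metric, sketched below. The control names, weights and effectiveness figures are invented for illustration, not drawn from NIST SP800-53 or ISO/IEC 27002:

```python
# Each control: (name, risk_weight, assessed_effectiveness 0.0-1.0)
# risk_weight reflects the importance of the risk the control addresses;
# effectiveness reflects suitability/maturity, not mere presence.
controls = [
    ("access control policy",  5, 0.9),
    ("malware protection",     4, 0.7),
    ("backup and recovery",    4, 0.5),
    ("physical entry control", 2, 1.0),
]

coverage = sum(w * e for _, w, e in controls) / sum(w for _, w, _ in controls)
print(f"Risk-weighted control coverage: {coverage:.0%}")  # 75%
```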

Business enablement metric

Information risk management and information security can be viewed as business enablers, going beyond ‘not getting in the way of business’ towards ‘facilitating the exploitation of new business opportunities’. That perspective suggests metrics such as the degree of alignment between Information Security and business departments when developing new IT systems, business processes etc., particularly in connection with new products and markets. The metric could be assessed by project and departmental managers on a sliding scale ranging from ‘totally misaligned, working at odds with each other’ up to ‘totally integrated, a seamless whole working to common goals’.

Compliance metrics

A cluster of information risks arises from the need for the organization and third parties to comply with various legal, regulatory, contractual, policy and ethical obligations concerning information security, privacy, intellectual property rights etc. Possible metrics in this area include:

  • The number and severity of compliance issues or exceptions raised during the reporting period by various stakeholders such as auditors, regulators, business partners, customers, prosecutors etc. concerning information risk-related obligations and expectations; 
  • The number of exemptions to corporate information risk and security policies, standards etc. authorized/granted by management; 
  • The net value, i.e. benefits (e.g. avoided fines, successful business relationships, strategic advantage) less costs (e.g. expenditure on ongoing compliance activities, meetings, reviews, audits, expert advisors etc.), of information risk and security compliance efforts (a trivial calculation, sketched below).
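
The net-value metric is simple arithmetic once the benefits and costs have been itemized. A minimal sketch, with entirely made-up figures:

```python
benefits = {
    "avoided regulatory fines": 250_000,
    "contracts won that required demonstrable compliance": 400_000,
}
costs = {
    "compliance staff and expert advisors": 180_000,
    "audits, reviews and meetings": 60_000,
    "tooling and reporting": 40_000,
}

net_value = sum(benefits.values()) - sum(costs.values())
print(f"Net value of compliance effort: ${net_value:,}")  # $370,000
```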

Information risk maturity metrics

Metrics can be developed from a set of markers denoting scoring points on various factors or parameters that, collectively, give a good indication and percentage score of the organization’s overall maturity in this domain. The scoring table below is straightforward to adapt and maintain as the requirements and best practices change, making this metric a good candidate for regular (e.g. annual) assessments of the organization’s information risk management practices. Furthermore, low-scoring rows clearly suggest areas where things ought to be improved, while high-scoring rows are good news, a rare commodity in information risk and security management! (A percentage-scoring sketch follows the table.)


No information risk management
  • No IRM or information security function as such, in fact nobody doing the job
  • No risk management metrics at all, and nobody cares

Weak information risk management
  • At least one IRM or information security person on the payroll, but hardly enough to constitute a ‘function’
  • Poor/incomplete risk management metrics

Decent information risk management
  • IRM or information security management function in operation with more than barely adequate resourcing
  • Decent risk management metrics including some specifically on information risk

Strong information risk management
  • Excellent IRM and information security management functions, fully staffed with highly qualified, experienced, competent and motivated professionals
  • Excellent risk management, information risk and information security metrics, well-established and highly valued by management
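
A minimal sketch of how such a scoring table might be turned into an overall percentage, assuming each row is scored 0-3 from ‘no IRM’ up to ‘strong IRM’; the row names and scores below are purely illustrative:

```python
# Score each row of the maturity table: 0 = none ... 3 = strong
row_scores = {
    "IRM/security function": 2,    # decent
    "risk management metrics": 1,  # weak
    # ...one entry per factor in the full scoring table
}

max_score = 3 * len(row_scores)
maturity_pct = 100 * sum(row_scores.values()) / max_score
print(f"Information risk maturity: {maturity_pct:.0f}%")  # 50%

# Low-scoring rows flag the areas most in need of improvement
for row, score in sorted(row_scores.items(), key=lambda kv: kv[1]):
    if score <= 1:
        print(f"Improvement candidate: {row} (scored {score})")
```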

Risk metrics

The two-dimensional Analog Risk Assessment (Probability Impact Graph) graphic is a striking visual metric, a stimulating way to measure and consider the organization’s information risks.

[Although we have identified a few typical information risks on the graphic, they are merely intended to demonstrate the approach and get you started.  You may well disagree with our analysis, and there are many other information risks not shown.  Building and discussing the graphic from scratch is a worthwhile approach for an information risk workshop, with the final graphic being more than just a record of the meeting: it's an excellent work product with real value for risk management and security operations.]
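
For readers who want to build such a graphic themselves, here is a minimal plotting sketch using matplotlib; the risks and their probability/impact placements are invented purely to demonstrate the layout:

```python
import matplotlib.pyplot as plt

# Illustrative risks with (probability, impact) judgments on 0-1 scales
risks = {
    "malware outbreak": (0.7, 0.5),
    "insider fraud": (0.3, 0.8),
    "cloud outage": (0.4, 0.6),
    "laptop theft": (0.8, 0.2),
}

fig, ax = plt.subplots(figsize=(6, 6))
for name, (prob, impact) in risks.items():
    ax.scatter(prob, impact)
    ax.annotate(name, (prob, impact), textcoords="offset points", xytext=(5, 5))

ax.set_xlabel("Probability")
ax.set_ylabel("Impact")
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_title("Analog Risk Assessment (probability-impact graph)")
plt.show()
```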

The risk-control spectrum is another visual way to rank and compare information risks, indicating the corresponding range of controls to address the risks.

Naturally, I will offer PRAGMATIC scores for the example metrics but please don't take them as definitive: I assessed the pros and cons of each metric as if they were being scored by a fairly large commercial organization, one sufficiently mature at information risk and security to be implementing or running an ISMS and hence seriously interested in security metrics. YMMV (Your Metrics May Vary).
