PRAGMATIC Security Metric of the Quarter #1

Having scored and discussed fourteen Security Metrics of the Week during the first three months of this blog, it seems appropriate now to take a step back, review the metrics we have discussed thus far and consider the value of the PRAGMATIC process.  

Here are the fourteen security metrics, tabulated in descending order of their overall PRAGMATIC scores.  Click any of the metrics to read more about them and discover why we scored them thus.


| Metric | P | R | A | G | M | A | T | I | C | Score |
|--------|---|---|---|---|---|---|---|---|---|-------|
| Discrepancies between physical location and logical access location | 75 | 76 | 72 | 90 | 82 | 75 | 85 | 83 | 60 | 78% |
| Information security policy coverage | 75 | 82 | 92 | 78 | 80 | 70 | 73 | 60 | 81 | 77% |
| Unowned information asset days | 40 | 51 | 84 | 77 | 74 | 86 | 92 | 94 | 82 | 76% |
| Number of unsecured access points | 95 | 80 | 90 | 70 | 85 | 77 | 45 | 75 | 55 | 75% |
| % of critical controls consistent with controls policy | 83 | 92 | 80 | 83 | 89 | 82 | 32 | 70 | 35 | 72% |
| Days since logical access control matrices for application systems were last reviewed | 55 | 80 | 95 | 30 | 80 | 85 | 60 | 70 | 80 | 71% |
| Number of unpatched technical vulnerabilities | 80 | 64 | 80 | 70 | 80 | 75 | 25 | 85 | 52 | 68% |
| Coupling index | 68 | 85 | 50 | 60 | 72 | 47 | 35 | 61 | 42 | 58% |
| Vulnerability index | 74 | 85 | 71 | 74 | 60 | 32 | 46 | 33 | 19 | 55% |
| System accounts-to-employees ratio | 74 | 67 | 38 | 39 | 68 | 42 | 36 | 83 | 44 | 55% |
| Corporate security culture | 80 | 80 | 60 | 55 | 75 | 55 | 10 | 45 | 10 | 52% |
| % of purchased software that is unauthorized | 71 | 51 | 90 | 75 | 82 | 35 | 13 | 20 | 6 | 49% |
| Security budget as % of IT budget or turnover | 13 | 3 | 16 | 2 | 2 | 0 | 4 | 18 | 88 | 16% |
| Number of firewall rules changed | 2 | 1 | 1 | 10 | 2 | 33 | 14 | 4 | 17 | 9% |

In simple numerical terms, the metric Discrepancies between physical location and logical access location leads this little pack, which qualifies it as <cue drum roll> our first PRAGMATIC Security Metric of the Quarter.  In fact there's clearly not much to choose between the top four metrics in the table in terms of their overall PRAGMATIC scores.  The scores, and hence the rankings, might well have differed had we made different assumptions in the scoring, or of course had we altered the specification or wording of individual metrics to address the issues we identified and hence altered their scores.  Furthermore, your scoring of the metrics may differ from ours because of differences in how we each understand and interpret both the metrics and the PRAGMATIC approach.  We don't have the same experience as you; our biases differ; our presumptions and organizational contexts differ; and no doubt we have different audiences and purposes in mind for these metrics.
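Judging by the figures in the table, each metric's overall PRAGMATIC score appears to be the simple arithmetic mean of its nine criterion ratings, rounded to the nearest whole percent. Here is a minimal sketch of that calculation in Python (the function name is ours, purely for illustration):

```python
def pragmatic_score(ratings):
    """Overall PRAGMATIC score for a metric: the arithmetic mean of its
    nine criterion ratings, rounded to the nearest whole percent."""
    assert len(ratings) == 9, "expected one rating per PRAGMATIC criterion"
    return round(sum(ratings) / len(ratings))

# Our top-ranked metric, "Discrepancies between physical location and
# logical access location", with its nine criterion ratings from the table:
print(pragmatic_score([75, 76, 72, 90, 82, 75, 85, 83, 60]))  # → 78
```

An unweighted mean treats all nine criteria as equally important; an organization that cares more about, say, Cost or Timeliness could of course weight the criteria accordingly before averaging.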

That whole line of discussion is moot, however, since we are not claiming that the PRAGMATIC approach is scientific and objective.  Our top-scoring metric is not necessarily the best of the bunch under all circumstances for all organizations, just as the lowest-scoring metric may be appropriate, and score more highly, in certain situations.  The approach simply offers a rational way to assess and compare the value of various security metrics, to elaborate on their pros and cons, to identify ways in which the metrics might be re-phrased or materially altered to improve them, and most of all to facilitate a more informed and productive metrics discussion with management.  Even if you simply use the process to shortlist the most promising of a bunch of candidate metrics, leaving the final selection to management, isn't that a worthwhile outcome?

There's plenty more to say yet about being PRAGMATIC, including ways to glean further useful information from the data in the scoring table above, but we'll leave that for the book, future blog pieces, seminars and papers.  Meanwhile, do please let us know about your favorite security metric and we'll see what we make of it.  We look forward to your comments on this blog and emails, especially constructive criticism and creative ideas to make the PRAGMATIC approach even more effective.  Over.