Tuesday 30 July 2013

Hacking awareness



We have just published a security awareness module on hacking and cracking. It was an interesting challenge to write on such a technical topic without either losing most of the audience in the technobabble or giving them the keys to the kingdom by teaching them how to hack.  We've carefully trodden a fine line this month.

I'm particularly pleased with the quiz.  We typically offer deliberately open-ended quiz questions to encourage people to discuss the topic in some depth, preferably in a social setting, for the most beneficial learning experience. Having struggled for a while, a sudden burst of inspiration led me to base the hacking quiz on hacking the quiz.  The question is essentially "If you were so inclined, how would you cheat at this quiz?"  I can just picture a bunch of slightly inebriated teams having a lot of fun with that deceptively simple idea.

By the way, what do you think of our word-art padlock graphic?  It's one of the poster images in the module, created by our talented graphic artist who unfailingly turns half-a-dozen rather vague suggestions into six beautiful works of art every month. 

We're having fun with - and sometimes making fun of - information security!

SMotW #67: No. of unlicensed software installations

Security Metric of the Week #67: number of unapproved or unlicensed software installations identified on corporate IT equipment


This is a simple compliance metric: a count of inappropriate or pirated software installations discovered on the network.  Using software to audit the network, the base data are easy enough to gather once the data-collection clients are in place, but reconciling the automated findings against license records is a different matter unless the organization has a strong license management system.  That in turn requires a strong culture of compliance with corporate policies and procedures on the correct procurement and licensing of software, and on updating the license database accordingly ... which is probably one of the key goals for this metric, supporting the more obvious, direct objective of cracking down on unlicensed software.
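By way of illustration, here's a minimal sketch of the reconciliation step, assuming the audit tool and the license management database can both export simple records; the field names and data structures are invented for illustration, not taken from any particular product:

    from collections import Counter

    # Hypothetical exports: rows from the network audit tool, and seat
    # counts from the license management database.  Field names are
    # invented for illustration.
    discovered = [
        {"host": "PC-0012", "product": "AcmeCAD", "version": "11.2"},
        {"host": "PC-0047", "product": "AcmeCAD", "version": "11.2"},
        {"host": "PC-0047", "product": "FooZip",  "version": "3.1"},
    ]
    entitlements = {"AcmeCAD": 1}   # product -> number of licensed seats

    def unlicensed_installations(discovered, entitlements):
        """Installations exceeding (or lacking) a license entitlement."""
        installs = Counter(row["product"] for row in discovered)
        return {product: count - entitlements.get(product, 0)
                for product, count in installs.items()
                if count > entitlements.get(product, 0)}

    # The metric is the total excess across all products:
    print(sum(unlicensed_installations(discovered, entitlements).values()))  # 2

The hard part in practice is not the counting but keeping the entitlements data accurate, which is exactly the cultural point made above.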


When considering the merits of this metric, Acme Enterprises Inc. was not in a particularly strong position with respect to license management, and management was unconvinced about the benefits of software auditing compared to the Costs (which, for them, would have included setting up a license management system).  Hence the metric's PRAGMATIC score was not very encouraging:

P     R     A     G     M     A     T     I     C     Score
58    55    82    73    86    47    64    66    17    61%
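In case you're wondering how the bottom line is calculated: the overall PRAGMATIC score is simply the unweighted arithmetic mean of the nine criterion ratings, rounded to a whole percentage - here (58 + 55 + 82 + 73 + 86 + 47 + 64 + 66 + 17) / 9 = 548 / 9 ≈ 61.  A trivial sketch in Python:

    def pragmatic_score(ratings):
        """Overall PRAGMATIC score: the unweighted mean of the nine
        criterion ratings (P, R, A, G, M, A, T, I, C), rounded to a
        whole percentage."""
        assert len(ratings) == 9, "one rating per PRAGMATIC criterion"
        return round(sum(ratings) / 9)

    # The unlicensed software metric, as rated by Acme's managers:
    print(pragmatic_score([58, 55, 82, 73, 86, 47, 64, 66, 17]))  # -> 61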

Some of Acme's managers were a little puzzled at first by the metric's reference to "unapproved or unlicensed".  The risks associated with unlicensed or pirated software are straightforward enough, but what are the dangers of unapproved software?  And what constitutes approval in this context, anyway?  During the short discussion that ensued, they quickly came to appreciate the issue and decided that the metric made sense, hence the high rating for Meaning.

Saturday 20 July 2013

Cyber risks up, according to Lloyd's survey

Lloyd's Risk Index 2013 is getting a fair bit of coverage in the information/IT security press since it ranks cyber risk the third most significant business risk this year, up from twelfth and nineteenth places in the 2011 survey. It ranks below the risks of high taxation (which I guess refers to the political risk of higher tax rates being introduced, since current tax rates are known and have a probability close to 1) and loss of customers (which is of course bad news for any business in terms of the impact, and is more likely when times are hard).

The following passage from page 11, on cyber risks, caught my eye:
"According to a report published in April 2013 by the Insurance Information Institute, employee negligence is responsible for 39% of data breaches, system glitches for 24% and malicious or criminal attacks for only 37%. That leaves nearly two-thirds of incidents caused by issues which should reasonably be within a business’ control."
It's not entirely clear what they mean. The "nearly two-thirds" is evidently the sum of employee negligence and system glitches (39% + 24% = 63%), implying that the remaining third or so - the malicious or criminal attacks - is not within a business's control, which is nonsense. All organizations subject to 'malicious and criminal attacks' (meaning practically everyone) should be well on top of preventive, detective and corrective controls against malware, hacking, fraud, social engineering and so on. Likewise 'employee negligence' is very much within management's domain of influence: policies and procedures, training and awareness, data entry validation and other technical controls all address it. It seems to me that organizations can and should address all significant information security risks, not just the two-thirds stated. Not to do so represents a governance failure.

I'm not even entirely sure what "data breaches", "system glitches" and "malicious or criminal attacks" are, in this context, although we can all guess.  Perhaps the original report from which this information was gleaned is more specific.

Anyway, the report recommends "... spending money upfront on risk management – and ensuring recommendations are implemented throughout a company – might go a long way to preventing a cyber disaster before it starts". Hear hear! It's a shame they didn't go on to explain what an organization ought to do to prevent "a cyber disaster", but that's no surprise given that it was a general business survey.


PS  I can't find the number of organizations surveyed in the report - a fundamental parameter, I would have thought, since it materially affects the margins of error (which aren't stated). The geographical spread does at least suggest a reasonably large survey.

Tuesday 16 July 2013

SMotW #66: organization's financial health

Information Security Metric of the Week #66: the organization's economic situation

An organization that is in dire straits, financially, is essentially forced to dig in, concentrating its remaining resources on sheer survival.  As such, it is likely to minimize its expenditure in all discretionary areas, including some (but hopefully not all!) aspects of information security.  Cutbacks may be severe, creating a depressing atmosphere that leads to the best people leaving, hastening the vicious downward spiral.  Conversely, an organization that is riding high, financially, is likely to have its infrastructure well in hand, with enough cash left over to invest in whatever people and projects management sees fit to support. Proposals to refine its information security arrangements towards best practice are far more likely to gain support in this situation, and the organization is more likely to be able to afford the caliber of people needed to make things happen.

So, at this strategic or gross level of analysis, it is not unreasonable to surmise that there is a relationship between the organization's overall financial health or economic status and the state of its information security.  Against that background, Acme's managers used the PRAGMATIC approach to explore the possibility of using Acme's financial health as an indicator of its information security situation.  

Given that the links between economics and security are, to be frank, somewhat tenuous, the metric's score was not terribly impressive:

P     R     A     G     M     A     T     I     C     Score
72    80    10    80    80    80    61    80    79    69%

The very low 10% rating for Actionability points to an obvious concern ("If the metric were well below par, would we have any idea what to do to fix it?"), hence this particular infosec metric seems unlikely to feature in Acme management's cockpit instrumentation.

During the PRAGMATIC discussion, however, one of the managers raised an intriguing counterpoint: if the two factors are indeed linked, wouldn't Acme's infosec status also indicate its financial health?  Might high-level infosec metrics be of value for general corporate management?  "Are we looking at this the wrong way around?" she asked.  

While the discussion headed off at a tangent on leading and lagging metrics, the CISO quietly contemplated that in those cold, dark, pre-PRAGMATIC times, this kind of creative discussion around metrics simply would not have occurred in the C-suite.  There was no common understanding about metrics, and little appetite or even opportunity to discuss their design or selection since management was forever desperately trying to make sense of the untidy heap of crappy metrics before them.  They were far too busy digging to notice the hole.

Friday 12 July 2013

PRAGMATIC Security Metric of the Quarter #5

Example Information Security Metric of the Fifth Quarter

The PRAGMATIC scores for another three months' worth of example information security metrics are as follows:

Example metric P R A G M A T I C Score
Information access control maturity 90 95 70 80 90 80 90 85 90 86%
Security policy management maturity 90 95 70 80 88 85 90 82 88 85%
Number of important operations with documented & tested security procedures 95 96 91 85 95 84 62 90 60 84%
Information security budget variance 70 90 85 77 80 77 80 90 95 83%
% of information assets not [correctly] classified 75 75 97 85 90 80 80 80 80 82%
Policy coverage of frameworks such as ISO/IEC 27002 70 75 90 69 85 76 72 65 85 76%
% of policy statements unambiguously linked to control objectives 92 91 64 60 85 65 45 75 75 72%
Rate of change of emergency change requests 64 71 69 73 78 70 70 69 83 72%
Total liability value of untreated/residual risks 88 98 59 33 96 33 77 38 10 59%
Entropy of encrypted content 78 66 23 78 3 93 74 79 34 59%
Embarrassment factor 26 38 20 50 63 72 40 87 87 54%
% of security policies that satisfy documentation standards 66 47 79 45 74 38 44 50 35 53%
Patching policy compliance 66 52 55 77 19 36 11 8 5 37%

Top of the heap are two maturity metrics scoring 86% and 85%, with a further three metrics also scoring in the 80s.

While it is tempting to recommend these and other high-scoring metrics to you, dear reader, please bear in mind that they were scored in the context of a fictional manufacturing company, Acme Enterprises Inc.  The scores reflect the perceptions, prejudices, opinions and needs of Acme's managers, given their current situation.  Things are undoubtedly different for you.  We don't know what's really important to you, your managers and colleagues, about information security.  We have no idea which aspects are of particular concern, right now, nor what might be coming up over the next year or three.  Hence we encourage you to think critically about the way we describe the metrics, and preferably re-score them.   

Furthermore, PRAGMATIC scores alone are not necessarily a sound basis on which to select or reject metrics.  It's not that simple, unfortunately, despite what you may think given the way we bang on and on about PRAGMATIC!  The scores are intended to guide the development of an information security measurement system, a well-thought-out suite of metrics plus the associated processes for measuring and using them.  Considering and scoring each security metric in isolation does not build the big picture view necessary to measure information security as a coherent and integral part of the organization's management practices.

The book describes PRAGMATIC scoring as the heart of a comprehensive method, an overall approach to information security metrics.  The method starts by figuring out your metrics audiences and their measurement requirements, building a picture of what you are hoping to achieve.   Knowing why certain security metrics might or might not be appropriate for your organization is arguably even more important than knowing which specific metrics to choose ... but, that said, the act of gathering, contemplating, assessing and scoring possible metrics turns out to be a productive way both to determine and to fulfil the needs.  It's a deliberately pragmatic approach, a structured method that achieves a worthwhile outcome more effectively and efficiently than any other approach, as far as we know anyway.  Perhaps you know different?

Thursday 11 July 2013

SMotW #65: information access control maturity

Security Metric of the Week #65: information access control maturity

Controlling access to information - permitting authorized and appropriate access while denying or preventing unauthorized and inappropriate access - is undeniably a core concern in information security.  It's pretty much all that old-skool IT security tried to achieve in terms of controlling access to data.  Back then, the overriding concern was confidentiality.  

These days the scope of our activities is much wider.  Restricting access to information remains important, but we also appreciate the need to disclose and use information where appropriate.  A data file locked away in a high-security vault is certainly confidential, but in most cases there's not a lot of point denying it to third parties unless we can also use it (i.e. it is available to us as and when we need it) and unless it is sufficiently accurate, trustworthy, complete and up-to-date (the integrity property).

If management expressed an interest in this area, how would you actually go about measuring your organization's approach to controlling access to information?  Stop and think about that for a moment before you read on.  Seriously, imagine you have been asked to develop a suitable access control metric for the CISO's information security dashboard, one that will be reported to and discussed by the C-suite every so often.  Exactly what would you measure, how, and why?  

If you already have a suite of security metrics in place, go check what (if anything) you are using to measure and report access control.  Go on, it'll only take a moment.

There's a fair chance you are using numbers from the IT systems concerning technical access controls.  'Rate of detected access violations' is an example, something you can probably glean from the security logs on your servers.  Fair enough, the rate gives an indication that people are (presumably) attempting and (presumably) failing to access files, disks, memory space, IT systems, network ports or whatever.  Similarly, intrusion detection/prevention systems can automatically spew forth metrics such as 'rate of attempted intrusions' that were detected and (presumably) blocked.  However, neither of these technical measures tells you how many invalid, unauthorized or inappropriate access attempts succeeded without being detected as such.  They are rather narrow, limited metrics.  They may be of some interest and utility for information security people and systems/network managers in fine-tuning the technical access controls, but as far as higher levels of management are concerned, they don't mean much and aren't much use for managing the organization's information security risks as a whole.
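For what it's worth, gleaning such a rate from the logs is straightforward; here's a minimal sketch, with invented log lines and an invented match pattern, since real formats (syslog, Windows event logs and so on) vary widely:

    import re

    # Invented log excerpt; real security logs differ in format but the
    # counting principle is the same.
    log_lines = [
        "2013-07-10T09:14:02 sshd: FAILED LOGIN user=alice from 10.0.0.7",
        "2013-07-10T09:14:09 sshd: FAILED LOGIN user=alice from 10.0.0.7",
        "2013-07-10T10:02:51 smbd: ACCESS DENIED on payroll share user=bob",
    ]

    violation = re.compile(r"FAILED LOGIN|ACCESS DENIED")

    def violation_rate(lines, period_hours):
        """Detected access violations per hour over the reporting period."""
        return sum(1 for line in lines if violation.search(line)) / period_hours

    print(f"{violation_rate(log_lines, 24):.2f} detected violations per hour")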

Other access control metrics might relate to the processes associated with controlling access, for instance many of the activities performed by Security Administration.  The rate of provisioning of user IDs, resetting user passwords, changing access rights and so on could be measured from Security Admin's job ticketing system, perhaps even counting how many forms they process each week.  Again, the measures are perhaps of use to the Information Security Manager or Head of Security Administration but not much beyond that.
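Assuming the ticketing system can export closed jobs with a date and a type (the export format below is invented for illustration), producing those weekly counts is a trivial tally:

    from collections import Counter
    from datetime import date

    # Invented export from Security Admin's job ticketing system.
    tickets = [
        {"closed": date(2013, 7, 1), "type": "password reset"},
        {"closed": date(2013, 7, 2), "type": "new user ID"},
        {"closed": date(2013, 7, 9), "type": "password reset"},
    ]

    def weekly_counts(tickets):
        """Tickets processed per ISO week, broken down by type."""
        counts = Counter()
        for t in tickets:
            year, week, _ = t["closed"].isocalendar()
            counts[(year, week, t["type"])] += 1
        return counts

    for (year, week, kind), n in sorted(weekly_counts(tickets).items()):
        print(f"{year}-W{week:02d} {kind}: {n}")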

Yet another measure might be the number of system accounts held by employees - we discussed that metric a long time ago ...

... We could go on, but hopefully that's enough to give you a flavor of the variety of ways to measure access control, and the limitations of many of the measures taken in isolation.

Acme Enterprises Inc. has considered a different approach, a kind of measure-of-measures - a higher-level metric that gives senior management an overview of all the different elements involved in controlling access to information.   Specifically, they have evaluated a maturity metric.

We have described the maturity metric approach before in relation to measuring security policy management, physical security, human resources, and compliance.  The access control maturity metric assesses the organization's status by reference to a notional maturity scale for each type of control.  It is quite straightforward for someone with professional experience of a wide variety of access controls to develop the maturity scale, especially with the benefit of applicable standards and guidelines in this area.  Acme's version has 4 scoring points on a continuous scale, ranging from no control whatsoever to outstanding control (best practices, you might say).

Take the line for access control policies, for instance, one of several rows used in the access control maturity metric.  The scoring points on this line are:
  • 100%: "Access policies are formally defined for all systems by information asset owners.  The rules are proactively implemented, confirmed, maintained, monitored and periodically reviewed by a dedicated Security Administration function."
  • 67%: "Access policies and rules are well defined, implemented and maintained on most systems, including all sensitive/critical systems, with some compliance activities such as exception logging."
  • 33%: "Access policies or rules are partially defined and implemented on some systems (e.g. controlling logical but not physical access) but are generally not well maintained."
  • 0%: "There are no access policies or rules of any description."

[For the rest of the table, please refer to Appendix H in the book.]
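To give a feel for how the row scores roll up, here's a minimal sketch, assuming each row is scored somewhere on the continuous 0-100 scale using the anchor descriptions as reference points, and that the rows are equally weighted; the rows and values below are purely illustrative, and a real assessment would use the full table from Appendix H:

    # Illustrative row-by-row assessments against the maturity table;
    # the actual rows and anchor wordings are in Appendix H of the book.
    assessments = {
        "Access control policies":    70,  # between the 67% and 100% anchors
        "User ID provisioning":       55,
        "Privileged account control": 40,
    }

    def maturity_score(assessments):
        """Overall maturity: the mean of the row scores, as a percentage,
        assuming rows are equally weighted."""
        return round(sum(assessments.values()) / len(assessments))

    print(f"Access control maturity: {maturity_score(assessments)}%")  # 55%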

The PRAGMATIC ratings for this metric tell a tale, as always:

P     R     A     G     M     A     T     I     C     Score
90    95    70    80    90    80    90    85    90    86%




The lowest rating, 70% for Actionability, reflects the fact that although the overall access control maturity score is not directly actionable, the detailed row-by-row scores are, to some extent.  In the managers' minds, this metric is highly Relevant to information security, implying that they consider access control essential.  With an overall PRAGMATIC score of 86%, this is clearly a strong candidate for Acme's information security metrics system.

Thursday 4 July 2013

SMotW #64: patching policy compliance

Security Metric of the Week #64: patching policy compliance

The idea behind this metric is to compare and reconcile the actual software patching status of corporate IT systems against the corporate policies and procedures on patching and vulnerability management.  

Clearly the details of the comparison and reconciliation depend largely on precisely what the policies and procedures demand, while the assessor (metricator) may be somewhat selective in assessing compliance.  So-called vulnerability assessment tools, for instance, typically search systems for installed software, determine the versions installed, then look up a database of known latest versions to see whether the software is up to date.  The process is almost entirely automated, making it quite cheap and easy to run ... but Acme's corporate policies and procedures require rather more than just "Always install the latest versions of software", such as:
  • Acme must maintain a database of installed software on all corporate IT systems;
  • Newly-released patches for software used by Acme must be assessed to determine whether they are applicable and necessary (e.g. the issues they address are causing problems for Acme, or are risks of concern to Acme);
  • Patches addressing security risks that are being actively exploited should be prioritized over those that are merely theoretical;
  • Patches addressing vulnerabilities that are exposed to the Internet or other threat groups (including internal threats and business partners) should be prioritized over those that are not so exposed;
  • Patches addressing security risks to business- or safety-critical systems should be prioritized over relatively low-risk systems;
  • Applicable, necessary patches must be checked and if appropriate tested to ensure that they will not adversely affect the operation of Acme systems, especially business-critical systems - unless the Information Security Manager determines that the risk of not applying a critical security patch outweighs the risk of it causing operational problems;
  • Patches must be reversible, in other words backups and other arrangements must be made to enable patches to be reversed-out (uninstalled) efficiently if problems appear in operation, even if the pre-installation checks and tests gave acceptable results.

Assessing compliance with a policy as complex as that requires a lot more effort than just running a vulnerability assessment tool.   
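To illustrate how the three prioritization rules above might be operationalized, here's a minimal sketch; the patch records and the weights are invented for illustration, not taken from Acme's actual procedures:

    # Invented records of outstanding patches, flagged against the three
    # prioritization rules in the policy above.  Weights are illustrative;
    # a real scheme would be agreed with Information Security.
    patches = [
        {"id": "P-001", "actively_exploited": True,  "exposed": True,  "critical_system": False},
        {"id": "P-002", "actively_exploited": False, "exposed": True,  "critical_system": True},
        {"id": "P-003", "actively_exploited": False, "exposed": False, "critical_system": False},
    ]

    def priority(patch):
        """Higher score = patch sooner, per the policy's priority rules."""
        return (4 * patch["actively_exploited"]   # exploited beats theoretical
              + 2 * patch["exposed"]              # exposed beats unexposed
              + 1 * patch["critical_system"])     # critical beats low-risk

    for p in sorted(patches, key=priority, reverse=True):
        print(p["id"], "priority", priority(p))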

The patching policy compliance status has some significance and Relevance to security, since systems that are not being properly patched probably expose a number of technical vulnerabilities due to design flaws and bugs in software.  Furthermore, the patching status may be somewhat indicative of policy compliance in general, meaning that this metric could be used as a compliance indicator.  However, it is glaringly obvious from the PRAGMATIC score of just 37% that the metric is not favored by Acme management.  In their assessment, it is unlikely to be Timely (time spent assessing compliance might be better spent patching).  It is cumbersome (i.e. Costly) to measure, and is measured by people with a direct interest in the subject areas (i.e. it lacks Independence or integrity).

P     R     A     G     M     A     T     I     C     Score
66    52    55    77    19    36    11    8     5     37%




The low score for Meaning is interesting.  Usually, a low score here indicates that the metric isn't self-evident and would need to be explained to the intended audience, at the risk of their misinterpreting it.  In their assessment, Acme's managers considered that the metric would primarily be an operational-level metric, aimed at information security and IT professionals.  As such, the fact that the metric is quite technical in nature would not be an issue, since the intended technical audience can be expected to understand what it means.  However, in this case, the metric does not provide sufficient technical detail to be useful.  In particular, as currently worded, it does not indicate any kind of priority for action.  Which systems need to be patched first, and why?