Posts

Showing posts from April, 2013

Fraud awareness module released

Frauds, scams, swindles and cons involve taking advantage of victims through the use of deception, which is itself a form of social engineering.  As such, fraud definitely qualifies as an information security concern, making it a valid topic for the security awareness program.  What’s more, fraud is an inherently fascinating subject.  The deviously creative nature of fraudsters means they find surprising ways to dupe and manipulate people, processes and systems, undermining or bypassing controls that superficially appear sound. Fraudsters may exist within or without the organization, sometimes both.  Procurement frauds, for instance, often involve dishonest or coerced employees acting in collusion with external suppliers to misappropriate the organization’s funds.  Collusion between individuals is a particularly challenging concern in relation to fraud since it negates a very important form of control – the division of responsibilities between individuals. The b...

Securing the security metrics

At the risk of appearing security-obsessed, I'd like to explore the information security risks and control requirements that should be taken into account when designing an information security measurement system, particularly if (as is surely the aim) the metrics are going to materially affect the organization's information security arrangements.  I'm talking here about the measurement system as a whole, not just the elements and metrics within it.  Information security is undoubtedly a concern for the executive suite's information security dashboard, the metrics database maintained by the CISO, and the monthly metrics report, but I'm taking a broader perspective. It is not appropriate for me to propose specific information security controls for your information security measurement system since I can barely guess at your circumstances - the threats, vulnerabilities and impacts in relation to your security metrics, and your business situation.  However, the rhetorical q...

Security metric #54: documentation of important operations

Security Metric of the Week #54: number of important operations with documented and tested procedures

At first glance, this week's example metric doesn't sound very promising.  The wording is ambiguous, its value unclear and its purpose uncertain. If you were a senior executive sitting on "mahogany row", trying to select some information security metrics to use as part of your governance role, you might well be tempted to reject this metric without further ado, given obvious concerns such as: It implies a straightforward count, giving no indication of how many "important operations" remain to be procedurally documented and tested.  What are "important operations" anyway?  Is it referring to business processes, information security processes, IT processes, activities, tasks or something else?  Who decides which ones qualify as "important", and on what basis? "Documented and tested" sticks two distinct issues into the one metri...
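One way to address the first of those concerns would be to re-express the metric as a proportion rather than a raw count. A minimal sketch in Python, using an entirely hypothetical inventory of "important operations":

```python
# Hypothetical inventory of "important operations" and their documentation
# status - the names and statuses are illustrative only, not ACME data.
important_operations = {
    "payroll run":       {"documented": True,  "tested": True},
    "nightly backup":    {"documented": True,  "tested": False},
    "incident response": {"documented": False, "tested": False},
    "patch deployment":  {"documented": True,  "tested": True},
}

covered = sum(1 for op in important_operations.values()
              if op["documented"] and op["tested"])
total = len(important_operations)

print(f"{covered} of {total} important operations "
      f"({covered / total:.0%}) have documented and tested procedures")
```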

Security Metric #53: entropy

Information Security Metric of the Week #53: entropy of encrypted content

Randomness is a crucial concept in cryptography. Aside from steganography, strongly encrypted information appears totally random with no discernible patterns or indicators that would give cryptanalysts clues to recover the original plaintext. "Entropy" is a convenient term we're using here to describe a measure of randomness or uncertainty - we're being deliberately vague in order to avoid getting embroiled in the details of measuring or calculating this metric. And, to be frank, because Shannon goes way over our heads. We envisage ACME using this metric (howsoever defined) to compare encryption systems or algorithms on a common basis, for instance when assessing new encryption products for use in protecting an extremely confidential database of pre-patent information. Faced with a shortlist of products, management seeks reassurance as to their suitability beyond the vendors' marketing hyp...
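As a rough illustration (our own sketch, not a definition from the book), the entropy of a sample can be estimated from its byte-frequency distribution: strongly encrypted output should approach the theoretical maximum of 8 bits per byte, whereas structured plaintext falls well short. A minimal sketch in Python, using made-up sample data:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte from byte frequencies."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(data).values())

# Illustrative comparison: repetitive plaintext versus random bytes standing
# in for well-encrypted output.
plaintext = b"extremely confidential pre-patent information " * 100
random_like = os.urandom(len(plaintext))

print(f"plaintext entropy:   {shannon_entropy(plaintext):.2f} bits/byte")
print(f"random-like entropy: {shannon_entropy(random_like):.2f} bits/byte (max 8.00)")
```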

PRAGMATIC Security Metric of the Year, 2013

Having just discussed our fifty-second Security Metric of the Week here on the blog, it's time now to announce our top-rated example security metrics from the past year.  <Cue drum roll> The PRAGMATIC Security Metric of the Year, 2013, is ... "Security metametrics" <Fanfare, riotous applause> Here are the PRAGMATIC ratings for the winner and seven runners-up, all eight example metrics having scored greater than 80%:

Example metric                 P   R   A   G   M   A   T   I   C   Score
Security metametrics           96  91  99  92  88  94  89  79  95  91%
Access alert message rate      87  88  94  93  93  94  97  89  79  90%
Business continuity maturity   90  95  70  80  90  85  90  87  90  86%
Asset management maturity      90  95  70  80  90  85  90  85  90  86%
Infosec compliance maturity    90  95  70  80  90  85  90  85  90  86%
Physical security maturity     90  95  70  80  90  85  90  85  90  86%
...
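Incidentally, the Score column appears to be simply the arithmetic mean of the nine PRAGMATIC ratings, rounded to the nearest whole percent - an assumption on our part, but one that matches the figures above. A minimal check in Python:

```python
# Assumes the overall Score is the rounded arithmetic mean of the nine
# PRAGMATIC ratings; the ratings below are the winner's row from the table.
def pragmatic_score(ratings: list[int]) -> int:
    assert len(ratings) == 9, "one rating per PRAGMATIC criterion"
    return round(sum(ratings) / len(ratings))

security_metametrics = [96, 91, 99, 92, 88, 94, 89, 79, 95]
print(f"Security metametrics: {pragmatic_score(security_metametrics)}%")  # 91%
```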

Security metric #52: external lighting

Security Metric of the Week #52: proportion of facilities that have adequate external lighting

This week's example metric represents an entire class of metrics measuring the implementation of information security controls.  In this particular example, the control being measured is the provision of external security lighting that is intended to deter intruders and vandals from the facilities.  It is obviously a physical security control, one of many.  The metric could be used to compare and contrast facilities, for example in a large group with several operating locations.  While we've picked on external lighting for the example, the metric could be used to measure almost any control. The metric's PRAGMATIC score is rather low:

P   R   A   G   M   A   T   I   C   Score
2   5   70  42  11  46  35  18  31  29%

Why has ACME's management evidently taken such a dislike to this metric?  Its shortcomings are laid out in some detail in the book (for instance, what does it mean by "adequate"?)...

Security metric #51: rate of IT change

Security Metric of the Week #51: perceptions of rate of change in IT

"Perceptions" are opinions, hence this is clearly a highly subjective measure.  Nevertheless, it could be argued that extreme readings have some information security significance.  Rapidly changing or highly dynamic IT towards the right of the U-shaped curve implies that those surveyed are distinctly uncomfortable with the pace of change.  ACME may perhaps be struggling to keep up with new technology, hence it may not be on top of the information security aspects, increasing its information security risks.  Conversely, slowly changing or relatively static IT on the left implies that ACME may not be investing in technology, hence it may be falling behind on information security and again may be taking risks.  In the middle ground, the impression is that those surveyed are relatively comfortable with the changing IT ... but it takes a leap of faith to equate their comfort to a low leve...

Five characteristics of effective metrics

What makes a security metric effective or good?  What makes one ineffective or bad?  Can we spot shining stars among the duds, short of actually firing them off to management for a few months and watching the fallout?  It's an interesting question that gets into our understanding of metrics. Naturally, Krag and I believe we know the answers, but we're not the only ones to have expressed an opinion on this. [Before you read on, what do you think makes a good security metric?  Take a moment to mull it over.  It's OK, you don't need to tell anyone.  It's your little secret.] Following a conference presentation by Gartner's Jeffrey Wheatman, Tripwire's Dwayne Melancon wrote up what he described as "a really good list of 'Five characteristics of effective metrics'" that had been presented by Wheatman: Effective metrics must support the business’s goals, and the connection to those goals should be clear. Effective metrics must be controllab...