Posts

Showing posts from July, 2012

SMotW #17: audit findings

Security Metric of the Week #17: number and severity of audit findings  Our latest 'security metric of the week' builds on the following premises. Firstly, the number and severity of audit findings bears some relationship to the state or maturity of the organization's governance, risk, compliance and security arrangements, along with the number, quality, scope and depth of the audits. Secondly, since audits are invariably independent and formal, the number of audit findings is an objective, cheap and easy-to-obtain measure, as is the 'severity' (or gravity or importance), provided findings are routinely rated/classified by the auditors, which they usually are. The severity of audit findings also helps focus management attention on the issues that really matter. [We are of course assuming that "audit finding" is a recognized term. Most if not all audit functions generate reports that identify and discuss discrete findings. Many also explicitly

Security awareness/training metrics

An interesting discussion on the ISO27k Forum concerns measuring security awareness and training activities. Most of the measures proposed so far have been 'input' or 'process' metrics, such as evaluation sheets rating the quality of the venue, the course materials, the food served, the tutor (even the parking spaces!). Many organizations collect basic data such as the number of attendees at awareness and training events, and the number of events attended by each employee in the course of a year. Input measurements of this nature are relatively cheap and easy to collect, in volumes large enough for meaningful statistical analysis, and some of those statistics are actually useful for management purposes (e.g. distinguishing good from bad trainers, and identifying areas for improvement in their techniques, materials or venues). A few may even be required for compliance reporting against [senseless] regulatory requirements such as

Trailblazing the compliance jungle

I first came across the Unified Compliance Project (UCP) about 5 years ago, when it was run by Dorian Cougias for the IT Compliance Institute (ITCi). While information security-related compliance obligations were mushrooming, UCP aimed to simplify, harmonize and perhaps even unify the laws, standards and regulations in this area. ITCi evidently turned up its toes in 2008, passing the UCP baton to Network Frontiers LLC, where it became the UCF (Unified Compliance Framework). Fast forward to 2012: Dorian remains in the driving seat for UCF, along with lawyer Marcelo Halpern and Network Frontiers CEO Craig Isaacs. Having apparently invested around $9m pulling together the content from a wide variety of laws, regulations, standards etc., plus $1m for the database to amalgamate, analyze and regurgitate requirements, UCF is now in a position to sell the information and expertise to bewildered organizations that are keen to identify and fulfill their compliance obligations. UCF'

SMotW #16: policy noncompliance

Security Metric of the Week #16: number of security policy noncompliance infractions detected  The extent to which employees comply with the organization's security policies sounds like the kind of thing that management might want to track and, where appropriate, improve. This week's metric is a typical, if rather naive, attempt to measure policy compliance ... by counting noncompliance incidents. Policies are 'mandated': in other words, management expects everyone to comply with them unless there are justified reasons not to (meaning authorized exemptions, for those organizations that are mature enough to appreciate the need to manage this aspect carefully). While management originates most of the security requirements documented in policies, some derive from external obligations under applicable laws, regulations or agreements with third parties (e.g. PCI-DSS). The metric's wording implies that unauthorized noncompliance 'infractions' (more ofte

Pre-PRAGMATIC

Hitherto - before the PRAGMATIC method was invented - deciding which security metrics to measure was a black art, a highly subjective decision-making process. One might even question whether organizations actually 'select' security metrics deliberately, systematically and rationally. Think about that for a moment. Why does your organization measure whatever it does measure in relation to information security? Does that mean that management doesn't care about all the other security stuff you could also measure? Does it really matter what security metrics you use? Pre-PRAGMATIC organizations presumably measure certain facets of information security because: they are cheap and easy to report, typically because the raw numbers are readily available (some systems generate pretty graphs straight out of the box, but does management need them?); they are recommended by someone, peers claim to measure them, or the organization is required to report them by some thi

Storms on the horizon

A new NIST standard, SP800-146 Cloud Computing Synopsis and Recommendations, set me thinking yet again about the Business Continuity aspects of cloudiness. I should start by explaining that I believe effective BC involves engineering an appropriate combination of resilience, recovery and contingency arrangements, wrapped up in a nice package of incident management for good measure. These four complementary approaches (five if you include the implied risk analysis) help ensure the continuity of critical business processes, along with the associated IT and comms services normally supporting them. Of these aspects, the standard mostly considers disaster recovery, and then only at a superficial level (it is a 'synopsis', after all). For example, paragraph 8.3.5 says: "Disaster recovery involves both physical and electronic mishaps with consumer assets. For natural disasters, replication of data at geographically distributed sites is advisable. For other physical disaster

SMotW #15: HR security maturity

Security Metric of the Week #15: Human Resources security maturity  In order to explain the PRAGMATIC score for this week's example security metric, we first need to introduce you to the concept of security maturity metrics. Bear with us. Section 8 of ISO/IEC 27002:2005 lays out a suite of HR-based information security controls that apply to the pre-, para- and post-employment phases. For example, prior to offering anyone a role, especially in a powerful/trusted position, wise organizations conduct suitable background checks to weed out unsuitable candidates. For highly sensitive government and military work, the security clearance process can involve an extensive range of checks including credit worthiness, criminal history, identity, qualifications, professional experience and character references, in addition to a structured interview process and on-the-job supervision/oversight during a probationary period. For low-grade positions such as office cleaners, pre-employment

PRAGMATIC Security Metric of the Quarter #1

PRAGMATIC Security Metric of the First Quarter  Having scored and discussed fourteen Security Metrics of the Week during the first three months of this blog, it seems appropriate now to take a step back, review the metrics we have discussed thus far and consider the value of the PRAGMATIC process. Here are the fourteen security metrics, tabulated in descending order of their overall PRAGMATIC scores. Click any of the metrics to read more about them and discover why we scored them thus.

Metric                                                                | P  | R  | A  | G  | M  | A  | T  | I  | C  | Score
Discrepancies between physical location and logical access location  | 75 | 76 | 72 | 90 | 82 | 75 | 85 | 83 | 60 | 78%
Information security policy coverage                                 | 75 | 82 | 92 | 78 | 80 | 70 | 73 | 60 | 81 | 77%
Unowned information asset days                                       | 40 | 51 | 84 | 77 | 74 | 86 | 92 | 94 | 82 | 76%
Number of unsecured access points                                    | 95 | 80 | 90 | 70 | 85 | 77 | 45 | 75 | 55 | 75%
% of critical co
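Judging from the complete rows of the table, each overall score appears to be nothing more exotic than the nine criterion ratings (P, R, A, G, M, A, T, I, C) averaged and rounded to the nearest whole percent. A minimal Python sketch, assuming a plain arithmetic mean (the function name is ours, not from the book):

```python
def pragmatic_score(ratings):
    """Combine nine PRAGMATIC criterion ratings (P,R,A,G,M,A,T,I,C)
    into a single overall percentage, assuming a simple rounded mean."""
    if len(ratings) != 9:
        raise ValueError("PRAGMATIC expects exactly nine criterion ratings")
    return round(sum(ratings) / len(ratings))

# Example: the top-ranked metric, "Discrepancies between physical
# location and logical access location"
print(pragmatic_score([75, 76, 72, 90, 82, 75, 85, 83, 60]))  # → 78
```

The same calculation reproduces the 77%, 76% and 75% scores of the other three complete rows, which is what suggests the rounded mean in the first place.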