Friday 3 May 2013

2013 Information Security Breaches Survey

The latest Information Security Breaches Survey is required reading if you care about information security risks.  The survey, commissioned from PwC by the British Government's Department for Business, Innovation and Skills, takes place every couple of years or so. The statistics are useful ... provided you take the trouble to think carefully about what you are being told.

Take for instance the following graphs and the associated commentary on page 6 of the technical report:

[Figures 9 and 10 from the report]

"Having a security policy is just the start; to prevent breaches, senior management need to lead by example and ensure staff understand the policy and change their behaviour.  Less than a quarter of respondents with a security policy believe their staff have a very good understanding of it; 34% say the level of understanding is poor.  There's a clear payback from investing in staff training.  93% of companies where the security policy was poorly understood had staff-related breaches versus 47% where the policy was well understood.  Worryingly, levels of training haven't improved much - 42% of large organizations don't provide staff with any ongoing security awareness training, and 10% don't even brief staff on induction.  Many instead seem to wait until they have a serious breach before training staff."
That's a whole lot of information to take in for starters, but let's take a closer look:
  • The two graphs represent answers from about 150 respondents each (not necessarily the same people) out of the 1,402 who took the survey.  Page 1 of the report told us the margin of error for 100 respondents was about 10% at the 95% confidence level, so even without doing the full calculation it is reasonable to assume a similar level of error, perhaps 8%, with 150 respondents (there's a quick back-of-the-envelope check after this list).

  • Page 1 also told us a little about the respondents: roughly half were based in London and South-East England, so the survey is biased towards that part of the world.

  • The respondents were in roughly equal proportions infosec pros, IT pros and business managers/execs. It seems fair to assume they have a reasonable understanding of their organizations' information security status. Infosec pros tend to be risk-averse by nature, while business managers/execs see risk in a more positive light, so perhaps those opposing biases cancel out? It's impossible to say for sure without more information.

  • Figure 9 separates out the numbers for large and small organizations in this year's survey, but those two categories were not identified separately in all the previous reports, making comparisons tricky.  The report indicates that the proportion of small businesses with a formally documented information security policy has fallen steadily from 67% in 2010, through 63% in 2012, to 54% now.  Given the ~8% margin of error, though, the differences may not be statistically significant (see the rough check after this list).

  • Figure 10 has similar issues: the differences may not be significant. Nevertheless, it is interesting that about one third of the respondents only cover awareness of security threats at induction (orientation) time, while about half have a programme of ongoing education (whatever that means! Requiring staff to attend an awareness class once every year or so presumably qualifies as 'ongoing education', but we know just how ineffective that approach can be).

  • "Having a security policy is just the start" could be simply a throwaway phrase to kick off the commentary, although it clearly implies a sequence of events. Furthermore, the text implies that policy is an important vehicle for changing behaviours. Personally, I'm not totally convinced on either point - there are some unanswered questions there that could have been addressed by the survey or other research ... which reminds me: there are few if any references to other sources of information and statistics in the report.  Some of the topics discussed in the report have undoubtedly been examined by rigorous scientific studies, so why aren't they referenced?

  • The commentary provides some additional statistics, although the report's authors have been selective. Stating "Less than a quarter of respondents with a security policy believe their staff have a very good understanding of it; 34% say the level of understanding is poor." gives the impression that most respondents think employees don't understand their policies, but that is an interpretation of data that are incompletely presented in the report.

  • We are none the wiser on how PwC concluded that "Many instead seem to wait until they have a serious breach before training staff."  Maybe there were one or more survey questions along these lines. Maybe PwC reached this conclusion on the basis of their audit and consultancy work, independently of the survey. Maybe the report's authors simply made it up to fill a gap: pure conjecture, perhaps.  We're left guessing.
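Since I've referred to it a couple of times above, here is the quick back-of-the-envelope check of those figures. It's a minimal sketch rather than anything taken from the report itself: it assumes the standard normal approximation for survey proportions at the 95% confidence level, the worst-case proportion of 50% for the margin of error, and (purely for illustration) roughly 150 small-business respondents in each year's sample for the 63%-versus-54% comparison, since the report doesn't give the earlier sample sizes.

```python
import math

Z95 = 1.96  # two-sided z-score for the 95% confidence level

def margin_of_error(n, p=0.5, z=Z95):
    """Worst-case margin of error for a proportion estimated from n respondents,
    using the normal approximation (p = 0.5 gives the largest margin)."""
    return z * math.sqrt(p * (1 - p) / n)

def two_proportion_p_value(p1, p2, n1, n2):
    """Two-sided p-value for the difference between two independent proportions,
    using a pooled two-proportion z-test."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Margins of error for the sample sizes mentioned in the report
for n in (100, 150):
    print(f"n = {n}: +/- {margin_of_error(n):.1%}")
# n = 100: +/- 9.8%   (roughly the 10% quoted on page 1)
# n = 150: +/- 8.0%   (the ~8% assumed above)

# Is the drop from 63% (2012) to 54% (2013) statistically significant,
# assuming roughly 150 small-business respondents in each survey?
print(f"p-value: {two_proportion_p_value(0.63, 0.54, 150, 150):.2f}")
# p-value: 0.11, i.e. not significant at the conventional 5% level
```

On those assumptions the nine-point drop falls short of statistical significance at the usual 5% level, consistent with the caution expressed in the bullets above, although different sample sizes would of course shift the numbers.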
While I have only discussed two graphs and about 130 words of commentary, a small part of the report's 19 or so pages, hopefully this has given you a clue about what I meant by 'thinking carefully about what we are being told' and, for that matter, what we are not being told.  The survey is well worth reading, although I recommend reading it critically to get the most value from it.  

PS  I wrote about security surveys over on the PRAGMATIC metrics blog some while back, concluding with "a very pragmatic bottom line: published security surveys are, on the whole, good enough to be worth using as security metrics. While many of us take them at face value, they are even more valuable if you have the knowledge and interest to consider and ideally compensate for the underlying issues and biases, thinking about them in PRAGMATIC terms."  
