Tuesday 4 July 2017

How many topics does your awareness program cover?


A piece on LinkedIn set me thinking this morning - actually several pieces did, but I shall spare you my cynicism, other than to say that the unbelievable din of superficial marketing tripe, me-me self-promotion and sycophantic back-slapping on LinkedIn all but drowns out the few grains of useful content.

But I digress.

Bucking the trend, IT Security Basics: A Basic IT Security Awareness Program for Your Employees by Marc Krisjanous does what it says on the tin, laying out the bare bones of a basic approach to IT security awareness aimed at 'employees' (in other words staff, users, hoi polloi). Do have a read: Marc has some good ideas in there, a step or two up from the most naive approach that is a common starting point for awareness.

On the other hand, and with all due respect to Marc, it falls short of good practice ... which got me mulling over the lifecycle, the stages through which awareness programs typically develop - in other words, another maturity scale.

-------------------------
[Before continuing, please take a moment to do something for me. Work out roughly how many topics your security awareness program has covered to date. I'm not talking about the overall scope - all those little things that merit the odd mention here and there - but the specific focal points or issues that the awareness program, materials and activities cover in some depth. Go ahead, check your list and tot 'em up. I'll explain why at the end of this piece.]
-------------------------

OK, bring on the maturity scale ...

Stage 0 - nothing: there are no security awareness activities whatsoever. That implies several things:
  • Negligible security awareness among employees in general, most being totally oblivious while a few may vaguely hope or believe that someone else 'does' security, whatever that means;
  • No interest in security awareness by management, presumably including the information/IT security people themselves (if there are any);
  • No roles and responsibilities in this area, and zero accountability. When (not if) incidents occur, the organization collectively takes the hit, and nobody feels compelled to do anything about it. Fingers point at everyone else;
  • An unnecessarily high level of information risk, hence those incidents I mentioned are likely to be both more numerous and more severe than they need be. Worse still, they come as nasty surprises, out of the blue, despite the possibility being glaringly obvious to any interested, security-aware onlookers (ransomware incidents being a highly topical example).

Stage 1 - starter: at this level there are some awareness activities, but they aren't really planned or managed as such - rather they are one-off or sporadic episodes with no defined purpose or goal, either individually or overall. The topics tend to be a more or less random selection, perhaps picking up on major incidents (such as ransomware) in a reactive way, arguably too late to achieve much benefit from awareness. The awareness materials are basic, to say the least - perhaps a lame poster lifted from the web (quite possibly infringing someone's copyright, since the lack of awareness may extend to the people 'doing' awareness). Stage 1 starter-level security awareness may be better than nothing, but only just!

Stage 2 - basic program: a program involves planning and management of the security awareness activities. Someone cares enough about it to determine what ought to be done and hopefully how and when to do it. However, the basic awareness program is typically run on a shoestring, either totally unfunded or seriously under-funded. There is little management interest or support, except perhaps the desire to do the least amount possible to satisfy compliance obligations (implying management's awareness of those obligations, at least). There's no real appreciation of the value of security awareness, a blind-spot that often extends to IT/cyber and information security in general. Due to the lack of funds, stage 2 programs are necessarily limited in scope and reach, for example targeting "users" (meaning certain IT users) with barely enough content to be worth distributing. This is paying lip-service, although management of stage 2 organizations may be aghast at being so labeled, due to their own lack of awareness.

Stage 3 - funded program: funding may indicate that management truly believes in the value of security awareness, but could also reflect a need to spend some spare cash, compliance pressure from the authorities, or drive from within (either individual leaders or departments such as IT, Risk, Legal, Privacy, Audit or, of course, IT/Information Security). We see the first inkling of accountability at this level, management realizing that if the organization suffers serious incidents, the lack of an awareness program points directly to their lack of governance. However, the awareness program itself may be little more than the stage 2 version, with limited topics, restricted audiences and narrow goals (perhaps still undefined). A minimalist approach is common, limited to external (legal and regulatory) and perhaps internal (policy) compliance.

Stage 4 - organization-wide program: extending the reach of the security awareness program to take in the entire organization takes things up a notch. It may not be immediately obvious, but this seemingly innocuous extension, to me, marks a dramatic change of emphasis from IT/cybersecurity to information risk and security as a whole. A lowly office cleaner, for instance, has important information security responsibilities, even though he/she is unlikely to use corporate IT (except perhaps taking advantage of the guest WiFi to catch up with Facebook on a cheap smartphone during breaks!). That's true even if he/she is a cleaning contractor employed by a service company, not actually an employee of the organization running the program. [This is why our security awareness materials refer to "workers" rather than "employees": we hope subscribers won't discriminate against third party maintenance people, contractors, consultants etc. working for the organization on-site.] A nice refinement here is to identify distinct awareness audiences or groups within the organization, developing awareness content and activities specifically designed to appeal to and help them, supplementing the more generic stuff aimed at workers (not just [IT] users, remember!) in general.

Stage 5 - psychology: security awareness and training is adult education in the corporate context. As such, the science behind education is relevant and applicable, particularly the behavioral sciences, psychology included. Appreciating the distinction between 'enforcement' and 'reinforcement', for instance, crucially divides awareness programs that are perceived negatively by their audiences from those that are perceived positively. The typical compliance-based approach essentially involves warning workers about the dire consequences of non-compliance - the personal and organizational penalties arising. Emphasizing the business and personal benefits of addressing information risks through appropriate security controls takes the discussion to a different place, particularly for management. Organizations at stage 5 truly appreciate the need for motivation as well as information, and so take steps to motivate and encourage.

Stage 6 - training and awareness system: large, mature organizations often have specialized training functions within or allied to HR. Their purpose is to assist with, if not actually deliver, training courses throughout the organization on a range of subjects and levels e.g. induction or orientation courses for new starters, compliance-driven courses, technical and skills-based training, and supervisory/management training. Learning Management Systems often come into play at this stage, opening the door for third party suppliers of training content. The systematic approach to awareness is another, more subtle element of stage 6. Although they usually focus on intensive training courses specifically, the professionals in training functions often have the background and skills to assist with awareness activities as well, if only they have the time and inclination. They also have more than just a clue about good practice ...

Stage 7 - good practice: there is a diffuse set of characteristics defining or demonstrating good practices in security awareness, including:
  • Professionalization - by which I mean employing or promoting competent, experienced and talented security awareness and training professionals (ideally close-knit teams, not just lone individuals), giving them the latitude and support to both do stuff right and do the right stuff. Career progression is as important for these people as for anyone else, hence skills enhancement courses, projects and other personal development opportunities are worthwhile for the kinds of people who excel in these roles, and just as valuable as more money (within reason!);
  • Interaction between information security or other specialists and the audiences, particularly in-person presentations, seminars, courses, workshops, demonstrations and so on, supplementing the typically rather dry, drab and lifeless written content. A suite of social skills is needed here, such as empathy ... which can be distinctly challenging for information security awareness people with classic IT/tech backgrounds and other personality types. Having said that, I'm relieved to note that the skills and competencies can be learnt and are certainly enhanced through practice;
  • Collaboration among and between specialists in different areas of expertise on shared awareness-related goals (e.g. health and safety plus site/physical security plus information security);
  • Standardization - both in the sense of turning the organization of awareness events and the production of awareness materials into repeatable and improvable sausage-machine operations, and by adopting the good practice advice in globally-respected standards such as ISO/IEC 27002 and NIST SP 800-50;
  • Meaningful metrics - measuring the things that truly matter to the organization in achieving its goals, as a way to enable, direct, drive and demonstrate progress, value, effectiveness, efficiency, maturity etc. If your idea of a good security awareness metric is to graph the number of people who have attended your events, you have quite a journey ahead! Metrics turn standardization into continuous improvement;
  • Creativity and innovation - catching the eyes and imaginations of the audience groups naturally helps engage them fully with the program. There are further advantages to being creative and innovative with the content, the formats, the modes of delivery and so on, not least the topics. Given the time taken to prepare and deliver awareness, and for the audiences to absorb and react to it, your awareness topics ought to reflect not just present but future threats and information risks to the organization. Good luck even figuring those out far in advance, let alone preparing sensible content - and I should know: this is a substantial part of my role; 
  • High quality materials delivering both breadth and depth. As well as covering fewer topic areas, immature awareness programs tend to be quite superficial in their coverage. Some topics deserve, and some audiences need, more in-depth content, but at the same time it's easy to confuddle the general awareness audience, requiring finesse in both the awareness messages and the awareness content.

Stage 8 - best practice: going beyond mere good practice, these are the award-winning awareness programs, figuratively if not literally. Best practice programs are outstanding in the field, highly effective and, in short, a roaring success. Their excellence is generally acknowledged by insiders (staff, managers and related third parties) and sometimes by outsiders too ... although, since organizations at this level tend to be in intensely competitive industries and/or in national security, government and defense, they tend to be quite discreet about it. Discretion is part of security awareness, after all! [Note: awards that can be bought rather than earned don't count, sorry. Integrity is part of information security.]

Stage 9 - cutting edge: whereas creative, experimental and innovative approaches to security awareness and training can come into play in a limited way at all levels, mature organizations that find good/best practices inadequate have little option but to push back the frontiers and strive for the ultimate. They go beyond best practice. It's not so much that best isn't good enough for them; rather, they totally accept the value proposition for security awareness and see more to be gained by going beyond the obvious - for example, a genuine security culture means far more than the set of goals on some promotional poster.

Stage 10 - dissolution: once information risk and security are utterly and deeply ingrained into the entire organization, there may be little need for a security awareness program as such. A strong security culture is inherently self-sustaining as vigilant, alert workers spot and react appropriately to information risks in an almost reflexive manner, hence paradoxically security awareness and training programs become less obvious at this level. The activities still occur but there is no longer any need to point them out since it is almost impossible to find any part of the organization, any person, any activity that is not inherently security-aware. Security has become "the way we do things around here". 

-------------------------

[OK, now, do you have that topic count I asked you for? The reason is that I suspect the number of topics might be a useful indicator of the maturity of an organization's awareness program. Simply divide your count by ten and check the correspondingly numbered stage, interpolating as appropriate. For example, if your program has covered 14 topics since its inception, I would guess that puts you part way between stages 1 and 2: you probably exceed the criteria for stage 1 with some aspects of stage 2, perhaps even odd bits from later stages too. If your honest answer was zero, well, I hope you would not be too surprised to be labeled a stage 0 organization! Notice the topic counts implied at the upper levels: here we're talking scores of topics, most likely spread over several years though, since trying to squeeze too much into any one year is bound to be counterproductive: people will become confused and overloaded, tuning out and disregarding the awareness messages. Notice I'm calling this an indicator, not a rigorous scientific metric based on known cause-and-effect relationships. There are conceivably fabulous awareness programs covering only a few topics, and crappy ones supposedly covering loads. However, I think as a general indicator it might be 'close enough for government work', and virtually free too - a valuable combination in security metrics.]
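
If you'd like to play with the arithmetic, here's a minimal sketch in Python - purely my illustration of the divide-by-ten-and-interpolate rule above, using the stage names from the scale, and very much an indicator rather than a calibrated metric:

# Rough sketch of the topic-count indicator described above (illustrative only).
# Stage names come from the maturity scale in this piece; the arithmetic is simply
# count / 10, clamped to the 0-10 range, interpolating between adjacent stages.

STAGES = {
    0: "nothing",
    1: "starter",
    2: "basic program",
    3: "funded program",
    4: "organization-wide program",
    5: "psychology",
    6: "training and awareness system",
    7: "good practice",
    8: "best practice",
    9: "cutting edge",
    10: "dissolution",
}

def estimate_stage(topic_count: int) -> str:
    """Map a cumulative count of awareness topics to an indicative maturity stage."""
    score = min(max(topic_count / 10, 0), 10)   # clamp to the 0-10 scale
    lower = int(score)
    upper = min(lower + 1, 10)
    if score == lower:                          # lands exactly on a stage
        return f"roughly stage {lower} ({STAGES[lower]})"
    return (f"roughly between stage {lower} ({STAGES[lower]}) "
            f"and stage {upper} ({STAGES[upper]})")

# The worked example from the text: 14 topics suggests somewhere between stages 1 and 2.
print(estimate_stage(14))   # roughly between stage 1 (starter) and stage 2 (basic program)
print(estimate_stage(0))    # roughly stage 0 (nothing)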

PS I'd love to know whether the awareness maturity model is sound and whether the suggested indicator works for your organization. You don't need to disclose the number of awareness topics or the stage you believe you have reached - in fact, if you are above the very bottom level, you are hopefully security-aware enough to realize that such disclosure may not be a good idea. Nevertheless, I'm keen to know whether it is sufficiently accurate and helpful for me to develop and publish this blog piece more widely. If not, how can I improve it? What have I missed or got wrong? Over to you ...
