Saturday 30 September 2017

Complying with Finagle's Law


Finagle's law elaborates on Sod's law: not only will anything that can go wrong, go wrong, but it will do so at the worst possible time.

With our self-imposed end of month deadline fast approaching, October's awareness module was close to being completed ... until a hardware failure caused a day's delay. A solid state disk drive gave up the ghost without warning last night. Naturally being highly security-aware we have backups, lots of backups, but rebuilding/restoring the system on a new disk inevitably takes time. Bang went my Saturday!

October's module is entirely new, being a new awareness topic for us, so it has taken longer than normal to prepare the module, leaving little slack in our schedule. Such is life. So, tomorrow I'll be slogging through what remains of the weekend, doing my level best to catch up and complete the materials for delivery on Monday, hopefully.

On the upside, our backups worked! We had enough spare hardware to survive this incident with relatively little impact except a day's lost work and elevated stress levels. An unplanned business continuity exercise.

Friday 29 September 2017

Strategic alignment

On the ISO27k Forum this morning, a member from a financial services company asked for some advice on aligning IT and Security with overall corporate/business strategies.  He said, in part: 
"Organizational level strategic plan, covering its core business, has been derived. And it includes what is expected form Technology and Security departments,  I.E. to keep customers, shareholders happy and to provide safe and secure technology services.   
[I need] to prepare a strategic plan decoded from organization's strategy, specifically for Technology and Security department, with goals, objectives, principles etc.  So for achieving this, my approach is to understand each business strategy and determine the possible ways that Technology and Security team can help it. 
Business strategy -> Technology strategy -> Security Strategy"
I strongly support the idea of explicitly linking 'our stuff' with corporate/business strategies (plus initiatives, projects and policies) but 'our stuff' is more than just technology security, or IT security, or cybersecurity, or data security. I encourage everyone to refer to information risk, defined as 'risk pertaining to information' - an all-encompassing term for what we are managing and doing. Especially in the strategic context, we should all be thinking beyond securing bits and bytes.

[The mere fact that they have a department, team or whatever named "Security" that he and presumably others consider a part of, if not very closely tied to, "Technology", strongly suggests a very IT-centric view in the organization. To me, there's the merest whiff of a governance issue there: treating this as 'IT's problem', with the emphasis on security (as in controls, restrictions and prohibitions, as much as protection and safety) is a common but, in my view, sadly misguided and outdated approach - a widespread cultural issue in fact.]

Identifying information risk aspects of the corporate strategies is a creative risk assessment activity. In stark contrast to financial risks, information risks tend to be largely unstated, if not unrecognized, at that level but can generally be teased out from the assumptions (both explicit and implicit). For instance, if a business strategy talks about "Expanding into a new market", consider what that actually means and how it will be achieved, then examine each of those proposed activities for the associated information risks - including for instance the information risk that the 'new market' opportunity has been misunderstood or misstated (often by whoever is eagerly promoting the approach, an obvious bias that experienced managers are adept at discounting in others, yet curiously reluctant to admit in themselves!). If it goes ahead, management are making significant assumptions that the market exists and is profitably exploitable using the proposed strategic approach, but what if they are wrong? What if the projections are unrealistic (overly optimistic or pessimistic: remember risk cuts both ways)? What if the assumptions turn out to be unfounded? What if 'something else happens'? These are just some of the information risks concerning a proposal that is being used as the basis for strategic business decisions - a high-stakes situation for sure. In addition, there are the more obvious implications for Security of going ahead with the strategy (e.g. finding the information risk and security specialists needed to support and guide the new market activity) plus other more subtle effects (e.g. diverting attention and resources from more mundane but potentially just as risky stuff).

Doing that kind of risk assessment properly and thoroughly is a lot of work - a major and potentially difficult and costly undertaking, involving business managers plus specialists from Security, Risk Management, IT, HR, Compliance, Business Continuity, Audit etc. It's a team effort, supporting and enabling each other and negotiating for the best overall outcome for the business as a whole. If that's not feasible given the current circumstances, maturity level and resources, then I recommend at least focusing on and clearly prioritizing risk associated with the organization's most valuable and/or vulnerable information assets. In financial services, customer financial data is undeniably worth protecting, so there should be little argument if the strategy lays out whatever that involves. Other things may dangle from that handy hook, within reason, but still it's better to be able to show that every single item in the strategy or plan relates to something that the business has identified as a driver, goal, objective etc. [OK, some of those relationships might be tenuous in practice, but still it's hard for managers to resist or block activities that relate to strategic goals - potentially career-limiting in fact.]

If we are able to do this properly, a significant advantage is that the business drivers for information risk and security form an excellent basis for security metrics: if our metrics measure those things, we can reasonably expect management to take notice and use them. If not, why are we wasting their time with irrelevancies? In other words, we can squeeze extra mileage out of the strategy development process by picking out the associated metrics that will help achieve the strategy. It's a win-win.

Don't forget that strategy is relatively long-term big-picture stuff. This is our big chance to plan the foundations for the future development and maturity of information risk and security management in the organization: it's not just about tagging dutifully along behind whatever the business is doing, but also setting things up so the business has more, better options going forward. It's part of 'business enablement'. If, for example, I would love (for sound business and professional reasons, you understand) to set up a superb Security Operations Centre but have so far been denied the opportunity, are there things we can do over the next year or so to lay the groundwork and get the process running so that, maybe in a few years' time, the SOC is more likely to be approved? The strategy development process is like a chess game: we need to think several moves ahead, and consider what the other players are doing and how they will respond to our moves. It's also a competitive team game: as much give as take. Call it back-scratching or horse-trading if that helps.

Thursday 28 September 2017

Safe & secure


The Coming Software Apocalypse is a long, well-written article about the growing difficulties of coding extremely complex modern software systems. With something in the order of 30 to 100 million lines of program code controlling fly-by-wire planes and cars, such systems are far too large and complicated for even gifted programmers to master single-handedly, while inadequate specifications, resource constraints, tight/unrealistic delivery deadlines, laziness/corner-cutting, bloat, cloud, teamwork, compliance assessments, airtight change controls and integrated development environments can make matters worse.

Author James Somers spins the article around a central point. The coding part of software development is a tough intellectual challenge: programmers write programs telling computers to do stuff, leaving them divorced from the stuff - the business end of their efforts - by several intervening, dynamic and interactive layers of complexity. Since there's only so much they can do to ensure everything goes to plan, they largely rely on the integrity and function of those other layers ... and yet despite being pieces of a bigger puzzle, they may be held to account for the end result in its entirety.

As if that's not bad enough already, the human beings who actually use, manage, hack and secure IT systems present further challenges. We're even harder to predict and control than computers, some quite deliberately so! From the information risk and security perspective, complexity is our kryptonite, our Achilles heel.

Somers brings up numerous safety-related software/system incidents, many of which I have seen discussed on the excellent RISKS List. Design flaws and bugs in software controlling medical and transportation systems are recurrent topics on RISKS, due to the obvious (and not so obvious!) health and safety implications of, say, autonomous trains and cars.

All of this has set me thinking about 'safety' as a future awareness topic, given the implications for all three of our target audiences:
  1. Workers in general increasingly rely on IT systems for safety-critical activities. It won't be hard to think up everyday examples - in fact it might be tough to focus on just a few!

  2. With a bit of prompting, managers should readily appreciate the information risks associated with safety- and business-critical IT systems, and would welcome pragmatic guidance on how to treat them;

  3. The professional audience includes the programmers and other IT specialists, business analysts, security architects, systems managers, testers and others at the sharp end, doing their best to prevent or at least minimize the adverse effects when (not if) things go wrong. By introducing the integration and operational aspects of complex IT systems in real-world situations, illustrated by examples drawn from James Somers' article and RISKS etc., we can hopefully get them thinking, researching and talking about this difficult subject, including ways to bring simplicity and order to the burgeoning chaos.
Well that's the outline plan, today anyway. No doubt the scope will evolve as we continue researching and then drafting the materials, but at least we have a rough goal in mind: another awareness topic to add to our bulging portfolio.

Wednesday 27 September 2017

Compliance culture

A discussion thread on CISSPforum about the security consequences of (some) software developers taking the easy option by grabbing code snippets off the Web rather than figuring things out for themselves (making sure they are appropriate and, of course, secure) set me thinking about human nature. We're all prone to 'taking the easy option'. You could say humans, and in fact all animals, are inherently lazy. Given the choice, we are inclined to cut corners and do the least amount possible, making this the default approach in almost all circumstances. We'd rather conserve our energy for more important things such as feeding and procreating.

Yesterday, Deborah mentioned being parked at a junction in town near a one-way side road. In the few minutes she was there, she saw at least 3 cars disregard the no-entry signs, breaking the law rather than driving around the block to enter the side road from the proper direction. Sure they saved themselves a minute or so, but at what cost? Aside from the possibility of being fined, apparently there's a school just along the side road. It's not hard to imagine kids, teachers and parents rushing out of school in a bit of a hurry to get home, looking 'up the road' for oncoming vehicles and not bothering to look 'down the road' (yes, they take the easy option too).

The same issue occurs often in information security. 'Doing the right thing' involves people minimizing risks to protect information, but there's a cost. It takes additional time and effort, compared to corner-cutting. 

Recognizing that there is a right and a wrong way is a starting point - easy enough when there are bloody great "No entry" signs on the road, or with assorted warning messages, bleeps, popup alerts and so forth when the computer spots something risky such as a possible phishing message. Informing people about risks and rules is part of security awareness, but it's not enough. We also need to persuade them to act appropriately, making the effort that it takes not to cut the corner.

You may think this is a purely personal matter: some people are naturally compliant law-abiding citizens, others are naturally averse to rules (sometimes on principle!), with a large swathe in the middle who are ambivalent or inconsistent, some plain ignorant or careless. How they react depends partly on the particular circumstances, including their past experience in similar situations ... which hints at another aspect of security awareness, namely the educational value of describing situations, explaining the consequences of different courses of action, guiding people in how they should respond and ideally getting them to practice until 'doing the right thing' becomes the default.

However, there is also a cultural aspect to this: social groups vary in their compliance. Compare driving standards in, say, Sweden with Italy for a clear demonstration of cultural differences at a national level. In practice, traffic lights, signs, rules and laws are at best advisory (derisory, you might say!) in much of the Mediterranean.

In the information security context, such cultural distinctions can make a huge difference to the way we express and enforce the rules necessary to protect information. Management in compliant organizations can develop, publish and mandate security policies and procedures, knowing employees will respect them (most of the time anyway), whereas in noncompliant organizations that approach alone would be inadequate - barely even the first stage. Additional activities would be needed to both reinforce and enforce compliance. That's potentially a large hidden cost arising from noncompliance, especially if it applies equally to all sorts of rules: tax laws, bribery and corruption, driving, privacy, intellectual property rights and so on.

Having just made a case for a culture of compliance, I should say that compliance per se is not the ultimate goal. One could argue that safety - not compliance - is the true objective of road signs, speed limits etc. From that perspective, compliance is merely a way to achieve the objective. So long as most of the drivers in Rome play the same game and stay reasonably safe, compliance with the road laws is incidental. [Judging by the proportion of beaten-up cars on the roads, I don't think the collision avoidance and hence safety objective is being met either, but that's a subjective opinion based on my cultural background!].

Monday 25 September 2017

Five-step bulletproofing?

In the course of searching for case study materials and quotations to illustrate October's awareness materials, I came across 5 ways to create a bulletproof security culture by Brian Stafford. Brian's 5 ways are, roughly: 
  1. Get Back to Basics - address human behaviors including errors. Fair enough. The Information Security 101 awareness module we updated last month is precisely for a back-to-basics approach, including fundamental concepts, attitudes and behaviors.

  2. Reinvent the Org Chart - have the CISO report to the CEO. Brian doesn't explain why but it's pretty obvious, especially if you accept that the organization's culture is like a cloak that covers everyone, and strong leadership is the primary way of influencing it. The reporting relationship is only part of the issue though: proper governance is a bigger consideration, for example aligning the management of information risks and assets with that for other kinds of risk and asset. Also security metrics - a gaping hole in the governance of most organizations.

  3. Invest in Education - "Any company that seeks to have a strong security culture must not only offer robust trainings to all employees—including the c-suite—but also encourage professional development opportunities tailored to their unique focus areas." Awareness, training and education go hand-in-hand: they are complementary.

  4. Incentivize & Reward Wanted Behavior e.g. by career advancement options. Again, the InfoSec 101 module proposes a structured gold-silver-bronze approach to rewards and incentives, and I've discussed the idea here on the blog several times. Compliance reinforcement through rewards and encouragement is far more positive and motivational than the negative compliance enforcement approach through pressure, penalties and grief. Penalties may still be necessary but as a last resort than the default option.

  5. Apply the Right Technology - hmm, an important consideration, for sure, although I'm not sure what this has to do with security culture. I guess I would say that technical controls need to work in concert with non-tech controls, and the selection, operation, use and management of all kinds of control is itself largely a human activity. The fact that Brian included this as one of his 5 ways betrays the widespread bias towards technology and cybersecurity. I'd go so far as to call it myopic.

Personally, and despite our obvious efforts in this area, I'd be very reluctant to state or imply that an organization's security culture could ever be considered bulletproof, not even in the purely rhetorical sense. It's an important part of a bigger set of things, one that happens to be relevant to most of information risk, security, privacy, compliance, governance and so on, but culture, alone, won't deflect bullets: knowing that, and being ready and willing to handle the consequences of incidents, is itself characteristic of a robust security culture.

Saturday 23 September 2017

Security culture sit rep

October's awareness module is gradually taking shape. The management and professionals' seminar slide decks and notes are about 80% done. They're quite intense, earnest and rather dull though, so we need something inspiring to liven things up a bit. More thinking and digging around required yet.

Meanwhile, the staff/general materials are coming along too. The next 7 days will be busy, systematically writing, revising, aligning and polishing the content until it gleams and glints in the sun - talking of which, we set the clocks forward an hour tonight for summer time: it has been a long, wet NZ winter this year.


Friday 22 September 2017

Cultured security

Aside from concerning the attitudes and values shared within groups, or its use in microbiology (!), there's another meaning of 'culture' relating to being suave and sophisticated. 

In the information risk and security context, it's about both being and appearing professional, exuding competence and quality - and that can be quite important if you consider the alternative. 

Given the choice, would you be happy interacting and doing business with an organization that is, or appears to be, uncultured - crude, slapdash, unreliable etc.? Or would you be somewhat reluctant to trust them?

There are some obvious examples in the news headlines most weeks: any organization that suffers a major privacy breach, hack, ransomware or other incident comes across as a victim, and arguably as culpable for the situation. It's hardly a glowing endorsement of their information risk, security, privacy and compliance arrangements! Contrast their position with the majority of organizations, particularly the banks that exude trustworthiness. Corporate cultures, brands and reputations are bound strongly together.

The two meanings of 'culture' are linked in the sense that the overall impression an organization portrays is the combination of many individual factors or elements. Through marketing, advertising and promotions, public relations, social media etc., management naturally strives to present a polished, impressive, business-like, trustworthy external corporate image, but has limited control over all the day-to-day goings on. Myriad interactions between workers and the outside world are largely independent, driven by the individuals themselves and by the corporate culture as a whole.

Management may try to control the latter, espousing 'corporate values' through motivational speeches and posters, but in most organizations it's like herding cats or plaiting fog. Much like managing change, managing the corporate culture is a tough challenge in practice. Realistically, the best management can hope for is to influence things in the right direction, perhaps rounding-off the sharpest corners and presenting a more consistently positive front.  

I'm talking here about the organization's integrity, one of the three central information properties alongside confidentiality and availability. Protecting, enhancing and exploiting the organization's culture is a core issue for information security, one that includes but extends well beyond the very limited domain of cybersecurity.

That in turn makes 'security culture' a valuable topic for the security awareness program, and makes the program a valuable part of running the business. The awareness materials and activities are not just meant to inform and influence individuals one-by-one, but to mold the overall corporate culture in a more generalized way. We're not just addressing 'users', computer systems, networks and apps. An effective awareness program deliberately envelops everyone in all parts and at all levels of the organization.

The awareness stream aimed at management will be particularly important in October's module. Our intention is to convince managers that:
  1. Although they may never have considered it before, the corporate security culture really matters to the organization - it's very much a business issue;

  2. While culture is largely an emergent property of dynamic social groups and interactions, it can be influenced, if not actually controlled, through sustained and deliberate actions - it's a strategic business issue;

  3. The security awareness program is a viable and valuable mechanism to influence the corporate security culture;

  4. Managers themselves are part of the strategic approach e.g. not merely mandating staff compliance with security and privacy rules through directives, policies and procedures, but walking-the-talk, demonstrating their personal concerns and proactively supporting information risk, security, privacy, compliance etc. - in other words showing leadership.

Wednesday 20 September 2017

Phishing awareness & cultural change


This plopped into my inbox last evening at about 8pm, when both ANZ customers and the ANZ fraud and security pros are mostly off-guard, relaxing at home. It's clearly a phishing attack, obvious for all sorts of reasons (e.g. the spelling and grammatical errors, the spurious justification and call to action, the non-ANZ hyperlink, oh and the fact that I don't have an ANZ account!) - obvious to me, anyway, and I hope obvious to ANZ customers, assuming they are sufficiently security-aware to spot the clues.
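That non-ANZ hyperlink is the kind of clue a machine can check as well as a human. Purely by way of illustration, here's a little Python sketch (the domain and sample message are invented, and real email filters are of course far more sophisticated) that pulls the links out of an HTML message and flags any pointing outside the expected domain:

```python
# Hypothetical sketch only: flag links in an HTML email whose host is not
# the expected sender domain - one of the phishing clues noted above.
from html.parser import HTMLParser
from urllib.parse import urlparse

EXPECTED_DOMAIN = "anz.co.nz"  # assumed legitimate domain, for illustration

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in the message body."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def suspicious_links(html_body, expected=EXPECTED_DOMAIN):
    parser = LinkExtractor()
    parser.feed(html_body)
    flagged = []
    for link in parser.links:
        host = urlparse(link).hostname or ""
        # anything not on (a subdomain of) the expected domain is suspect
        if host != expected and not host.endswith("." + expected):
            flagged.append(link)
    return flagged

sample = '<p>Your account is limited!</p><a href="http://evil.example/login">Verify now</a>'
print(suspicious_links(sample))  # ['http://evil.example/login']
```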

I guess the phishers are either hoping to trick victims into disclosing their ANZ credentials directly, or persuade them to reveal enough that they can trick the bank into accepting a change of the mobile phone number presumably being used for two-factor authentication, or for password resets.

Right now (8 am, 12 hours after the attack) I can't see this particular attack mentioned explicitly on the ANZ site, although there is some basic guidance on "hoax messages" with a few other phishing examples. The warnings and advice are not exactly prominent, however, so you need to go digging to find the information, which means you need to be alert and concerned enough in the first place, which implies a level of awareness - a classic chicken-and-egg situation. I presume ANZ has other security awareness materials, advisories and reminders for customers. If not, perhaps we can help!

Aside from the authentication and fraud angle, I'm interested in the cultural aspects. Down here in NZ, people generally seem to be quite honest and trusting: it's a charming feature of the friendly and welcoming Pacific culture that pervades our lives. Given its size and history, things may be different in Australia - I don't know. But I do know that phishing and other forms of fraud are problematic in NZ. The Pacific culture is changing, becoming more careful as a result of these and other scams, but very slowly. Increasing distrust and cynicism seems likely to knock the corners off the charm that I mentioned, with adverse implications for tourism and commerce - in other words cultural changes can create as well as solve problems. 

The same issue applies within organizations: pushing security awareness will lead (eventually, if sustained) to changes in the corporate culture, only some of which are beneficial. It's possible to be too security-conscious, too risk-averse, to the point that it interferes with business. October's awareness seminar and briefings for management will discuss a strategic approach aiming to settle the organization's security culture in the sweet spot somewhere between the two extremes, using suitable metrics to guide the process.

Tuesday 19 September 2017

What is 'security culture'?

For some while now, I've been contemplating what security culture actually means, in practice. 

Thinking back to the organizations in which I have worked, they have all had it to some extent (otherwise they probably wouldn't have employed someone like me!) but there were differences in the cultures. What were they?

Weaknesses in corporate security cultures are also evident in organizations that end up on the 6 o'clock news as a result of security and privacy incidents. In the extreme, the marked absence of a security culture implies more than just casual risk-taking. There's a reckless air to them with people (including management - in fact managers in particular) deliberately doing things they know they shouldn't, not just bending the rules and pushing the boundaries of acceptable behavior but, in some cases, breaking laws and regulations. That's an insecurity culture!

The strength of the security culture is a relative rather than absolute measure: it's a matter of degree. So, with my metrics hat on, what are the measurable characteristics? How would we go about measuring them? What are the scales? What's important to the organization in this domain?

A notable feature of organizations with relatively strong security cultures is that information security is an endemic part of the business - neither ignored nor treated as something special, an optional extra tacked on at the side (suggesting that 'information risk and security integration' might be one of those measurable characteristics). When IT systems and business processes are changed, for instance, the information risk, security and related aspects are naturally taken into account, almost without being pushed by management. On a broader front, there's a general expectation that things will be done properly. By default, workers generally act in the organization's best interests, doing the right thing normally without even being asked. Information security is integral to the organization's approach, alongside other considerations such as quality, efficiency, ethics, compliance and ... well ... maturity.

Maturity hints at a journey, a sequence of stages that organizations go through as their security culture emerges and grows stronger. That's what October's security awareness content will be addressing, promoting good practices in this area. Today I'll be exploring and expanding on the maturity approach, drawing conceptual diagrams and thinking about the governance elements. What would it take to assemble a framework facilitating, supporting and strengthening the corporate security culture? What are the building blocks, the foundations underpinning it? What does the blueprint look like? Who is the architect?

Where does one even start? 

I've raised lots of rhetorical questions today. Come back tomorrow to find out if we're making progress towards answering any of them! 

Friday 15 September 2017

Symbolic security


An article bemoaning the lack of an iconic image for the field of “risk management” (e.g. the insurance industry) applies to information risk and security as well. We don’t really have one either. 

Well maybe we do: there are padlocks, chains and keys, hackers in hoodies and those Anonymous facemasks a-plenty (a minute's image-Googling easily demonstrates that). Trouble is that the common images tend to emphasize threats and controls, constraints and costs. All very negative. A big downer.

Information risk and security may never be soft and cuddly ... but I'm sure we can do more to distance ourselves from the usual negative imagery and perceptions. I really like the idea of information security being an enabler, allowing the organization to do stuff (business!) that would otherwise be too risky. So I'll be spending idle moments at the weekend thinking how to sum that concept up in an iconic image. Preferably something pink and fluffy, with no threatening overtones.

Wednesday 13 September 2017

Surveying the corporate security culture

Inspired perhaps by yesterday's blog about the Security Culture Framework, today we have been busy on a security culture survey, metrics being the first stage of the SCF. We've designed a disarmingly straightforward single-sided form posing just a few simple but carefully-crafted questions around the corporate security culture. 

Despite its apparent simplicity, the survey form is quite complex with several distinct but related purposes or objectives:
  • Although the form is being prepared as an MS Word document with the intention of being self-completed on paper by respondents (primarily general staff), the form could just as easily be used for an online survey on the corporate intranet, a survey app, or a facilitated survey (like shoppers being stopped in the shopping mall by friendly people with clipboards ... and free product samples to give away).

  • The survey form is of course part of our security awareness product, linking-in with and supporting the other awareness content in October's module on 'security culture', and more broadly with the ongoing awareness program.  The style and format of the form should be instantly familiar to anyone who has seen our awareness materials. 

  • A short introduction on the form succinctly explains what 'security culture' means and why it is of concern and value to the organization, hence why the survey is being carried out. I'm intrigued by the idea of positioning the entire organization as a ‘safe pair of hands’ that protects and looks after information: a reasonable objective given the effort involved in influencing the corporate security culture. Even the survey form is intended to raise awareness, in this case making the subtle point that management cares enough about the topic to survey workers' security-related perceptions and behaviors including their attitudes towards management. 

  • Conducting the survey naturally implies that management will consider and act appropriately on the results. We take that implied obligation seriously, and will have more to say about it in the module's train-the-trainer guide. The survey is more than just a paper exercise or an awareness item: respondents will have perfectly reasonable expectations merely as a result of participating.

  • The survey questions themselves are designed to gather measurable responses i.e. data on a few key criteria or aspects of 'security culture'. We have more work to do on the questions, and even when we're done we hope our customers will adapt them to suit their specific needs (e.g. if there is an organization-wide issue around compliance, it might be worth exploring attitudes and perceptions in that area to tease out possible reasons for it). For starters, though, the questions are extremely simple - at face value, very quick and easy to read and answer - and yet given sufficient responses, the survey is a powerful, statistically valid and meaningful metric measuring a complex, multi-faceted and dynamic social construct. No mean feat that! [There's a toy sketch of how such responses might be scored, after this list.]

  • It would be feasible to develop further forms to survey populations other than 'general employees'. I'm thinking particularly of management and perhaps third parties: how does the corporate security culture appear from their perspectives? What concerns them? Are there issues that deserve concerted action? We may not have the time to prepare forms for October's awareness module ... but we might pose that suggestion to our subscribers, again in the train-the-trainer guide.

  • Beneath each of the questions are spaces for respondents to comment, plus we encourage respondents to make their views known either on the reverse or (to maintain their anonymity) on a separate sheet, web page or email. We take the interactive approach quite deliberately and routinely because there's a lot of value to be gained by getting workers to open up a little and mention things that concern or interest them, from their perspectives and in their terms. In the particular context of the survey, we want to give respondents the opportunity to explain, expand or elaborate on the numeric responses if they feel the need. It's surprising just how powerful and insightful quotes direct from the horse's mouth can be. Pithy quotations make excellent content to illustrate and pep up management reports and further awareness materials.

  • Mentioning 'free product samples' and 'sufficient responses' suggests the possibility of offering some sort of inducement for people to complete the survey, other than the opportunity to express their opinions and hopefully influence management. I have previously mentioned the gold-silver-bronze 'award menu' included in the Information Security 101 module: bronze level rewards would be ideal for this purpose. [Provided the anonymity aspect is addressed, a more attractive silver or gold award could be offered in, say, a prize draw: given the potential business value of the information generated by a well-designed survey, that's not a bad investment.]
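As promised above, here's a toy sketch of how a handful of survey responses might be turned into numbers. Everything in it - the questions, the 1-to-5 scale, the scoring - is purely illustrative, not our actual survey design:

```python
# Illustrative only: aggregate Likert-scale survey answers into simple
# per-aspect and overall security culture scores on a 0-100 scale.
from statistics import mean

# Each response: five aspects rated 1 (strongly disagree) to 5 (strongly agree).
responses = [
    {"leadership": 4, "compliance": 3, "reporting": 5, "trust": 4, "training": 3},
    {"leadership": 2, "compliance": 4, "reporting": 3, "trust": 3, "training": 4},
    {"leadership": 5, "compliance": 4, "reporting": 4, "trust": 5, "training": 4},
]

def culture_scores(responses):
    """Average each aspect across respondents, rescaled to 0-100."""
    aspects = responses[0].keys()
    return {a: round(mean(r[a] for r in responses) / 5 * 100) for a in aspects}

scores = culture_scores(responses)
print(scores)
print("overall:", round(mean(scores.values())))
```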
So there we go. All we have to show for a whole day's work is a single page survey form (oh, and this blog piece!), illustrating once again the key point I made in relation to the elevator pitch for Information Security 101: the shortest, pithiest awareness pieces are often the hardest to prepare. Less really is more!

Tuesday 12 September 2017

Book review: Build a Security Culture


In preparing for our forthcoming awareness module on security culture, I've been re-reading and contemplating Kai Roer's Security Culture Framework (SCF) - a structured management approach with 4 phases.

1. Metrics: set goals and measure

Speaking as an advocate of security metrics, this sounds a good place to start - or at least it would be if SCF explored the goals in some depth first, rather than leaping directly into SMART metrics: there's not much point evaluating or designing possible metrics until you know what needs to be measured. In this context, understanding the organization's strategic objectives would be a useful setting-off point. SCF talks about 'result goals' (are there any other kind?) and 'learning outcomes' (which implies that learning is a goal - but why? What is the value or purpose of learning?): what about business objectives for safely exploiting and protecting valuable information?

SCF seems to have sidestepped more fundamental issues. What is the organization trying to achieve? How would what we are thinking of doing support or enable achievement of those organizational objectives? Security awareness, and information security as a whole, is not in itself a goal but a means to an end. I would start there: what is or are the ends? What is information security awareness meant to achieve? 

Having discussed that issue many times before, I'm not going to elaborate further today, except to say that if the Goals are clear, the Questions arising are fairly obvious, which in turn makes it straightforward to come up with a whole bunch of possible Metrics (the GQM method). From there, SMART is not such a smart way to filter out the few metrics with a positive value to the organization, whereas the PRAGMATIC metametrics method was expressly designed for the purpose.
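For a flavor of the difference, here's a deliberately simplistic sketch of PRAGMATIC-style scoring: rate each candidate metric against the nine criteria and rank by average score. The candidate metrics and numbers below are invented for demonstration; in practice the scoring is a considered judgment, not numbers plucked from the air:

```python
# Invented example: score each candidate metric 0-100 against the nine
# PRAGMATIC criteria, then rank the candidates by their mean scores.
from statistics import mean

CRITERIA = ["Predictiveness", "Relevance", "Actionability", "Genuineness",
            "Meaningfulness", "Accuracy", "Timeliness", "Independence", "Cost"]

candidates = {
    "Policy compliance rate":        [70, 85, 80, 75, 90, 70, 80, 60, 85],
    "Count of malware alerts":       [30, 40, 35, 50, 45, 60, 80, 55, 90],
    "Security culture survey score": [75, 90, 70, 65, 85, 60, 50, 70, 75],
}
assert all(len(scores) == len(CRITERIA) for scores in candidates.values())

for avg, name in sorted(((round(mean(s)), n) for n, s in candidates.items()),
                        reverse=True):
    print(f"{avg:3d}  {name}")
```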

SCF further muddies the waters by mentioning a conventional Lewin-style approach to change management (figure out where you are, identify where you want to be, then systematically close the gap) plus Deming's Plan-Do-Check-Act approach to quality assurance. I'm not entirely convinced these are helpful in setting goals and identifying measures. I would have preferred SCF to elaborate on the process of analyzing the organization's core business, teasing out the 'hooks' in the business strategies on which to hang information security and hence security awareness. Those are powerful drivers, not least because only a fool would seriously resist or interfere with something that explicitly supports or enables strategic business objectives - a career-limiting move, to be sure!

2. Organization: involve the right people

Involving the right people makes sense for any activity including the previous step in SCF - in other words, the right people need to be involved in defining and clarifying the organization's objectives, which means these two activities overlap. Despite the numbering, they are not entirely sequential. The right people must be actively engaged in setting goals initially, and in deciding who else needs to be involved.

Sequencing issues aside, the second module of SCF discusses ways to identify 'the right people' for two distinct purposes: (1) those who will run the 'security culture program' (whatever that is! It is undefined at this stage); and (2) the target audience for security awareness (again, part of the vague 'security culture program').  

I fully support the idea of identifying awareness audiences, which is why our awareness service delivers three parallel streams of content aimed at workers in general, managers and professionals. While we don't subdivide those audiences, we recommend that the security awareness professionals to whom the materials are delivered do so - it's standard advice in the train-the-trainer guide in virtually every awareness module to identify who has an interest in the monthly topic, and work with them to customize, communicate, inform and persuade. In many cases that comes down to business departments or functions, and sometimes individual people (e.g. the Privacy Officer clearly needs to be actively engaged in privacy awareness, along with the Legal/Compliance function - or their equivalents since their titles, responsibilities and interests may vary). 

SCF picks out executives, HR and Marketing as obvious examples of groups you would probably want to involve, and fair enough ... although I can think of many more (such as the two mentioned above). In fact it's hard to think of any part of the organization that could safely be excluded, given that information flows throughout the entire organization like a nervous system.

SCF mentions the idea of nominating ambassadors or champions, hinting at the process we call 'socializing information risk and security'. It also mentions the need for regular communications of tailored messages - good stuff.

3. Topics: choose activities

The advice here is to "Build culture that works by choosing relevant topics and activities". I'm confused by 'culture that works' but in practice determining the security awareness and training topics is the focus of this module, and that's quite straightforward.  There's sound advice here:
"One thing to note about topics is that it is highly unlikely, and usually not something you would want, to cover all topics in one year. Long-term results are created by carefully crafting a plan to build the security culture you want over the course of several years."  
True, for two reasons: (1) given a broad perspective on information risk and security, there are lots of topics to cover, hence a lot of information to impart; and (2) cultural changes are inevitably slow. People need time to receive and internalize information, and change their ways. They need gentle encouragement and support, motivation and, in some cases, enforcement of the security rules.
"Some topics are relevant at different stages of an employee lifecycle. One example is introducing new employees to policies and regulations when they begin working. Another is during relocation, when it may make sense to train the employee in local security routines."
The need to include information risk and security in induction or orientation training is obvious, no problem there. Relocation, though, is not a strong example: in 'employee lifecycle' terms, what about internal moves and promotions, and eventually leaving the organization?  Those are almost universal activities that do indeed have information risk and security implications that the awareness program might usefully cover. Hmmm, perhaps we should put that idea into practice with the awareness materials. We already cover some aspects (such as periodically reviewing and adjusting workers' information access rights).

Some of the advice in SCF has become lost in translation e.g.:
"To map down topics that builds up under goal and matches an organizational map is one method to get a good overview. The easiest one is those who targets the whole organization and builds up under the overall goals in the goal hierarchy. Those who only target segments of the organization demands mostly more work."
Que?

SCF mentions a few forms or styles of awareness and training - mostly training in fact, with an emphasis on computer methods. 


4. Planner: plan and execute

SCF's advice in this area is straightforward and conventional, quite basic though helpful for someone just getting into security awareness for the first time, or at least the first time in a structured, planned way. 

Aside from defining goals, audiences and topics, and establishing metrics, there's little discussion of project or program management as a whole, including (1) risk management (what are the risks to your awareness program? What could go wrong? What should you be doing to mitigate the risks? And what about opportunities? Can you seize the opportunity and take advantage of business/organizational situations, or for that matter novel information risk and security situations such as the recent ransomware outbreaks, and forthcoming changes in privacy as a result of GDPR?); (2) resource management (e.g. recruiting, training and developing the awareness team, plus the extended team taking in those awareness ambassadors mentioned earlier); and (3) change management (it's ironic that change is noted earlier in SCF but not in the sense of managing changes to the awareness program itself - aspects such as changes to management support and perceptions, personnel changes, changes of focus and approach as old ways lose their impact and new ideas emerge, maturity, and changes prompted by the security metrics).


Conclusion

SCF has some good points, not least focusing attention on this important topic. The advice is fairly basic and not bad overall, although the sequencing and reference to other approaches is a bit muddled and confusing.

Of more concern are the omissions, important considerations conspicuously absent from the website's overview of SCF e.g. business value, psychology, adult education, compliance, motivation and maturity. I'm disappointed to find so little discussion of security culture per se, given the name of the framework: it mostly concerns the mechanics of planning and organizing security awareness and training activities, barely touching on the before and after stages. Perhaps Kai's training courses go further.

That said, both the Security Culture Framework website and Kai's book "Build a Security Culture" are succinct, and patently I have been sufficiently stimulated to write this critique. I prefer Rebecca Herold's "Managing an Information Security and Privacy Awareness and Training Program" but you may feel differently. There's something to be said for getting to know both of them, plus other approaches too such as David Lacey's "Managing the Human Factor in Information Security" - another excellent book.

Monday 11 September 2017

Security culture



Last night we watched a documentary on the History Channel about 9-11 - a mix of amateur and professional footage that took me back to a Belgian hotel room in 2001, watching incredulously as the nightmare unfolded on TV. Tonight there are more 9-11 documentaries, one of which concerns The War On Terror. As with The War On Drugs and The War On Poverty, we're never going to celebrate victory as such: as fast as we approach the target, it morphs and recedes from view. It's an endless journey.

The idea of waging war on something is a rallying cry, meant to sound inspirational and positive. In some (but not all) cultures it is ... and yet, in a literal sense, it's hard to imagine any sane, level-headed person truly relishing the thought of going to war. According to Margaret Atwood, "War is what happens when language fails", in other words when negotiations fail to the point that violent action is perceived as the best, or last remaining, option.

In truth, The War On Whatever involves more than just violent action: the negotiations don't stop, they just change. In public, they evolve into rhetoric and propaganda, fake news and extremism intended to elicit deeply emotional responses. In private, there's the whole issue of reaching agreement, defining the bottom line, stopping the untenable costs, saving face and redefining the boundaries.

National cultures and attitudes towards war and safety go way beyond the remit of our awareness service, and yet the corporate security culture has its roots in human perceptions, beliefs, ethics and moral values. We're unlikely to make much headway in changing those, although that alone needn't stop us trying! Hopefully we can influence some attitudes and hence some behaviors, perhaps drawing on cultural cues as part of the process.

There's plenty more to say on security culture as we work our way through the month: I promise future episodes will be less jingoistic and more upbeat. 

Friday 8 September 2017

Security certification

Aside from the elevator pitch, another short awareness item in our newly-revised Information Security 101 module is a course completion certificate, simply acknowledging that someone has been through the induction or orientation course.

I say 'simply' but as usual, there's more to it.

For a start, some of us (especially those who consider ourselves 'professionals') just love our certificates: our qualifications and the letters before/after our names mean something to us and hopefully other people. This is a personal thing with cultural relevance, and it's context-dependent (my 30-year-old PhD in microbial genetics has next to nothing to do with my present role!). My even older cycling proficiency certificate is meaningless now, barely a memory, but at the time I was proud of my achievement. Receiving it boosted my self-esteem, as valuable a benefit as being able to demonstrate my prowess on two wheels. I'm tempted to use Cprof on my business cards just to see if anyone reads them!

On the other hand, a certificate indicating a pass mark in some assessment or test can be misleading. The driving test, for example, is a fairly low hurdle in terms of all the situations that a driver may have to deal with over the remainder of their driving career. There is clearly a risk that a newly-certified and licensed driver might be over-confident as a result of passing the test and going solo, a time when accidents are more likely. Hence some countries encourage a subsequent period of driving with special P-plates (meaning probationary, or passed, or potential ...) in the hope that other road users will give new drivers more space. In risk terms, there are risk-reduction benefits in letting new drivers continue to hone their new-found skills, offsetting the increased risk of incidents.

In the same way with the InfoSec 101 course completion certificate, we're glad to acknowledge the personal achievement and boost people's self-esteem (yay - something positive associated with information risk and security!), although there is a risk they might believe themselves more competent in this area than they truly are. On balance, we'd rather deal with that issue, partly through the ongoing security awareness activities that delve deeper into areas covered quite superficially in the 101 module, across a broader range of topics, and partly through the corporate support structures and processes - the security culture that will be covered in next month's awareness materials.

Perhaps at some later point, well after induction, it might be appropriate to test workers again and then issue the equivalent of those advanced driver certificates, accompanied by benefits analogous to lower insurance premiums? We include awareness tests in every module, so it's certainly feasible to track their scores and reward the star performers. There's even a rewards menu in the 101 module, complete with bronze, silver and gold-level certificate ideas, among many others.
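Mapping test scores to award levels is trivial in mechanical terms - something along the lines of this hypothetical fragment (the names, scores and thresholds are all made up):

```python
# Made-up names, scores and thresholds, purely to illustrate the mechanics
# of tracking awareness test scores and assigning tiered awards.
scores = {"alice": [92, 88, 95], "bob": [70, 74, 68], "carol": [85, 90, 81]}

def award(average):
    if average >= 90: return "gold"
    if average >= 80: return "silver"
    if average >= 70: return "bronze"
    return "no award"

for person, results in scores.items():
    avg = sum(results) / len(results)
    print(f"{person}: average {avg:.0f} -> {award(avg)}")
```

The hard part, as ever, is the human side: setting thresholds that motivate rather than demoralize.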

Notice the emphasis on positivity and reward. We'd much rather focus on those who pass and succeed than those who fail. Let's be frank here: failing something as basic as an InfoSec 101 awareness test (or driving test!) is really bad news, perhaps even justifying dismissal of new workers at the end of a probationary period. Such a hard line is something organizations might consider appropriate or necessary, especially in industries where information risks are substantial (e.g. defense, critical infrastructure, finance, government, health and IT), but it's not part of our remit. Personally, I would find such an approach unacceptable: instead I'd settle for remedial one-on-one training and limiting access to information until a passing grade is attained. To be honest, I'm more comfortable passing the buck to local management and HR in such delicate areas, especially given the employment law compliance implications.

There's another aspect to the 101 course completion certificate, concerning the award issue process itself: we provide a form letter to be sent along with the certificate by or on behalf of the CISO, ISM or some other appropriate manager. Most of all, it's an opportunity to re-emphasize that newcomers are integral, valuable parts of an organization that proactively protects and exploits information. Encouraging further contact between workers and the Information Security function bolsters the social network, directly supporting the oft-espoused but generally vacuous line that "We are all responsible for information security".  Yes we are, but there's more to it than trotting out some trite line on a poster or policy.

By the way, that's NOT our certificate imaged above. Ours is more classy, more refined, more attractive, more valuable. At least we think so. Aside from the execution, the concept is invaluable. And now it's yours. 

Wednesday 6 September 2017

Passwords are dead



I've blogged about passwords several times. It's a zombie topic, one that refuses to go away or just lie down and die quietly.

On CISSPforum, we've been idly chatting about user authentication for a week or so. The consensus is that passwords are a lousy way to authenticate, for several reasons.

First the obvious.  Passwords are:
  • Hard to remember, at least good ones are, especially if we are forced to think up new ones periodically for no particular reason;

  • Generally weak and easily guessed, due to the previous point;

  • Sometimes generated and issued, rather than chosen or changeable by the user;

  • Readily shared or disclosed (e.g. by watching us type), or written down;

  • Readily obtained by force, coercion, deception and other forms of social engineering such as phishing or password reset tricks, or by interception, hacking, brute force attacks, spyware ... well, clearly there are lots of attacks;

  • Often re-used (for different sites/apps etc., and over time).
Next comes some less obvious, more pernicious lousiness:
  • Badly-designed sites/systems sometimes prevent us using strong passwords (e.g. they must be less than 20 characters with no spaces or special characters ...; must be typed or clicked manually - no automation allowed);

  • Poor guidance on choosing passwords encourages poor choices;

  • Passwords are sometimes weakened covertly by even lousier sites/systems (e.g. we can enter complex 50-character passwords but they only actually use 6, or store them in plaintext, or use a pathetically weak or broken hashing algorithm, often without a salt ...).
In short, passwords are not a reliable way to authenticate people. As a security control, they are weak to mediocre at best, not strong ... which is obviously a concern when authentication really matters. Some sites and apps have moved to multi-factor authentication, generally passwords or PIN codes plus some other factor, such as a cryptographic token, 'bingo card' or some other piece of hardware, or software, or biometrics, or locational information (e.g. GPS coordinates) or system characteristics (operating system + IP address or IMEI).
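On that hashing point, for what it's worth, here's a minimal sketch of how a site should store passwords - salted and iteratively hashed, never plaintext - using nothing but the Python standard library. The parameter choices are reasonable illustrations rather than a definitive recommendation:

```python
# Sketch of salted, iterated password hashing with PBKDF2 from the
# standard library; store the salt, iteration count and digest only.
import hashlib, hmac, os

def hash_password(password, iterations=200_000):
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest  # never store the plaintext

def verify(password, salt, iterations, stored):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

salt, rounds, stored = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, rounds, stored))  # True
print(verify("password123", salt, rounds, stored))                   # False
```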

Passwords are dead
Long live passwords

Martin from Sweden has been telling us about an interesting federated authentication system there called BankID, based around a mobile app. The app serves credentials to various Swedish organizations enrolled in the scheme, not just the bank that originally authenticated the user (using a hardware token). It allows the user to check the details at authentication time (e.g. the transactions you are authorizing). It is multifactor: you need PINs or passwords to access your mobile and the app (things you know), plus the device and its keys (things you have). Presumably it has mechanisms to handle lost/stolen mobiles, and new mobiles.

It's a successful, working system, not just a model or theory.  Cool!

I'm still interested in the idea of continuous authentication, supplementing the conventional one-time login process at the start of a session with user activity monitoring during the session to confirm that the logged-in user is behaving normally, and has not suddenly started typing differently, accessing different apps and sites, gambling, making large payments to Swiss bank accounts or whatever.
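To make the idea a little more concrete, here's a highly speculative sketch of one tiny behavioral signal - inter-keystroke timing - compared against a per-user baseline. A real continuous authentication system would fuse many richer signals, and all the numbers here are invented:

```python
# Speculative sketch: flag a session when recent typing intervals deviate
# markedly from the user's historical baseline.
from statistics import mean, stdev

baseline = [0.18, 0.22, 0.20, 0.19, 0.21, 0.17, 0.23, 0.20]  # seconds, past sessions

def looks_anomalous(recent, baseline, threshold=3.0):
    """True if the recent mean interval is > threshold standard deviations off."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(mean(recent) - mu) / sigma > threshold

print(looks_anomalous([0.19, 0.21, 0.20, 0.18], baseline))  # False: looks like our user
print(looks_anomalous([0.55, 0.60, 0.48, 0.52], baseline))  # True: someone else?
```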

Monday 4 September 2017

InfoSec 101 elevator pitch, final part

Moving on from our discussion of the first two paragraphs of this month's elevator pitch paper, here's the closing paragraph:
As a manager, you play a vital governance, leadership and oversight rôle.  Please make the effort to engage with and support the security awareness program, discuss information risk and security with your colleagues, and help us strengthen the corporate security culture.
In classical marketing terms, it's the call-to-action for people who have been lured and hooked. Having presented our case, what do we actually want them to do?  

Compared to the preceding two, the third paragraph is quite long. 

While we could easily have dropped the first sentence, it serves a purpose. It shows deference to the management audience, acknowledging their influential and powerful status, gently reminding them that they are expected to direct and oversee things. Essentially (in not so many words), it says "Pay attention! This is an obligation, one of your duties as a manager."

The final sentence, including those three words in bold, was especially tricky to write for the Information Security 101 module. What is it, exactly, that we expect senior managers to do in relation to this very broad introductory-level topic? Think about that question for a moment. There are many possible answers e.g.:
  • Show leadership
  • Demonstrate commitment
  • Support the Information Security Management System (in an ISO27k organization)
  • Get actively involved in information risk and security management activities, such as risk assessment and risk treatment decisions
  • Raise the profile and priority of information risk and security matters
  • Provide adequate resources to do this stuff properly for once (!)
  • Encourage or enforce compliance
In the end, we settled on asking managers to demonstrate their 'support' in a non-specific way. In practice, that would vary between individual managers in various business units or departments. The call-to-action is context-dependent and hence very difficult to specify without an understanding of the audience and their situation, which we don't possess at the point of writing the awareness materials. It should be clearer when the messages are being delivered, and obviously we hope they make enough sense to resonate and influence the audience's decisions and behaviors, otherwise awareness is a pointless exercise.

In other awareness modules, the closing message for the elevator pitch is usually more obvious in that we focus the spotlight on distinct areas of information risk and security each month. For instance, in August's awareness module on cyberinsurance, the elevator pitch ended with a thought-provoking question: "Without cyberinsurance, serious cyber incidents could prove devastating if they occurred: we would save the insurance premium but is that a gamble worth taking?". The call-to-action was implicit rather than explicit. Our words deliberately raise a doubt. We couldn't simply say "Buy cyberinsurance!" as that may be inappropriate and unnecessary for some customers, not least those who already have it. Although more explicit, something along the lines of "Consider taking out cyberinsurance" would have been bland, lame and pathetic. "Is that a gamble worth taking?" is more of an intellectual challenge. In fishing terms, we're trying to get a rise out of the audience.

This month, we've deliberately sown the seed for next month's awareness module on 'security culture'. There will be much more to say, expanding those three bold words into an entire awareness topic. Linking the awareness topics together like this is yet another way to form a series of discrete awareness items into a coherent program, in turn supporting the security culture. 

So, there you go. Over three blog pieces, it has taken me about a thousand words to explain a hundred. Has it prompted you to think differently about management-level security awareness? 

I think it's obvious why short awareness items can take a disproportionate amount of effort to compose. The end result has very few words, but they are very carefully selected for maximum impact and value.

Cue Blaise Pascal:
"Je n'ai fait celle-ci plus longue que parce que je n'ai pas eu le loisir de la faire plus courte."
which Google translates as:
"I have made it longer only because I have not had the leisure [time] to make it shorter."
If you don't have the time and energy to prepare security awareness content, leave it to us! It's what we do - more than just a job, it's our passion.

Sunday 3 September 2017

InfoSec 101 elevator pitch, part 2 of 3

Yesterday, I started telling you about one of the smallest deliverables in our awareness portfolio, the elevator pitch aimed at senior executive management. Despite its diminutive size, a lot of effort goes into selecting and fine-tuning those 100-odd words.

[Sorry if this detailed deconstruction of the pitch one paragraph at a time is tedious but I think it's useful to understand the design, the purpose of the page and the thinking that goes into it. As far as I know, we are the only security awareness provider specifically targeting senior management in this way. I've made disparaging comments in the past about awareness programs aimed at "end-users": neglecting other employees - especially managers and professionals - seems incredibly short-sighted to me, a bit like trying to teach the passengers how to drive a car, ignoring the driver and the mechanics.] 

OK, pressing swiftly ahead, the elevator pitch can be interrupted at any point. If someone is presenting or talking it through with an exec, they may well need to break off to answer questions or respond to comments. If a busy exec is quickly skimming the piece online or on paper, they might get distracted by a phone call or email. We may only have their attention fleetingly, if at all.

If we're lucky, the exec will swallow the bait and be hooked ... so the second paragraph has the essential barb:
Cybersecurity is important but there’s more to it than IT. Information security enables the business to exploit information in ways that would otherwise be too risky.
'Cybersecurity' is all the rage, of course. It's a term we see frequently in the media.  Although it's rarely defined, it is generally interpreted as IT and network security, specifically around Internet-related tech incidents such as hacking and malware. That's all very well, but what about all the rest of information risk and security? What about social engineering scams and frauds, piracy, industrial espionage and so forth? What about the whole insider-threat thing: where does that fit in relation to 'cyber'? 

Oh, hang on a moment: explaining the first 10-word sentence of the second paragraph took me about 100 words. Admittedly my explanation rambles on a bit, but on the other hand it's still just the tip of the iceberg.

The sentence that ends the second paragraph again mentions "business" - quite deliberately so but did you even notice? Down here in New Zealand, we are suffering a spate of intensely annoying radio advertisements that inanely repeat some key word that, I presume, the client asked the ad agency to promote. "Wallpaper" is one that springs to mind, repeated about a dozen times in a typical 30 second ad. Maybe they think they are being clever because here I am talking about their wallpaper advertisement, but the repetition is so distracting that I can't remember the rest of the ad, including the company or product names. I reflexively hit the off button whenever I catch the first few seconds!

Rant aside, our second paragraph throws down a challenge before the reader. It's deliberately open-ended and thought-provoking. If cybersecurity is more than just IT, what else is it?  How does information security enable the business, and what's all this about risk anyway?

Tomorrow I'll conclude this little series by blogging about the final paragraph. Are you on the hook?