Information risk assessment (reprise)
On the ISO27k Forum this morning, an FAQ made yet another appearance. SR asked:
"I am planning to do risk assessment based on Process/Business based. Kindly share if you have any templates and also suggest me how it can be done."
Bhushan Kaluvakolan responded first by proposing a risk assessment method based on threats and vulnerabilities (and impacts, I guess), a classical information-security-centric approach that I've used many times. Fair enough.
I followed up by proposing an alternative (and perhaps complementary) business-centric approach that I've brought up previously both on the Forum and here on the blog:
- Consider the kinds of incidents and scenarios that might affect the process, both directly and indirectly. Especially if the process is already operating, check for any incident reports, review/audit comments, known issues, management concerns, expert opinions etc., and/or run a risk workshop with a range of business people and specialists to come up with a bunch of things – I call them ‘information risks’. This is a creative, lateral-thinking process – brainstorming. Focus as much as possible on the information, especially information that is plainly valuable or essential to the business. If necessary, remind the experts that this is a business situation, a genuine organizational concern that needs pragmatic answers, not some academic exercise in precision.
- Review each of those information risks in turn and try to relate/group them where applicable. Some of them will be more or less severe variants on a common theme (e.g. an upstream supply-chain incident can range from mild, such as minor delays and quality issues on non-critical supplies, to severe, such as the sudden, unanticipated total failure of one or more key suppliers in some catastrophe like the Japanese tsunami). Others will be quite different in nature (e.g. various problems with individual employees, IT systems etc.). A neat way to do this is to write each risk on a separate sticky note, stick them on a whiteboard and briefly explain them, then move them into related/different groups of various sizes and shapes.
- Discuss and evaluate each (grouped) risk according to its probability (or possibility or chance or likelihood or frequency … or whatever) of occurrence, and the organizational impact (or severity or criticality or trouble or nastiness or scale or cost or size or drama … or whatever) if it ever does occur. Plot them out on a PIG (probability-impact graph). There are several examples here on this blog, plus tips on running risk workshops etc. Instead of the ‘whiteboard’ noted above, those ‘sticky notes’ could be text overlaid on a colourful blank PIG graphic, pre-drawn on a computer screen in, say, PowerPoint or Visio.
- Once all/most of your identified risks are on the PIG, and you have had a good chance to discuss their wording and positioning and relationships, set aside some time to focus on any in the red zone i.e. high-probability, high-impact risks: these are clearly priorities for the business. What can/should be done to treat them? What needs to be put in place to enable the risks to be treated? Who needs to drive that work (the ‘risk owner’)? How will the resources be found and allocated? When does it need to happen, and how? Continue with the orange zone risks, and the greens too if you are obsessive and have the time and energy (are there existing risk treatments/controls for the greens that might safely be relaxed or retired?). This generates a draft action or risk treatment plan, prioritized according to information risk (there’s a simple illustrative sketch of this zone-and-prioritize step after this list). Look for opportunities to schedule and align activities with other parallel initiatives where it makes sense (e.g. linking process changes with IT system or supplier changes, business reorganization etc.).
- Check for any outliers, anomalies, and open issues. Looking at the whole PIG, is there anything that seems odd, or wrong, or worrying? Take an even broader business or strategic perspective: how does this PIG and this set of information risks fit in with other PIGs and other risks facing the organization? Are there issues and constraints in this area that often crop up in other areas too, hinting at a common cause that maybe ought to be tackled as well? Again, are there opportunities to hook on to other business initiatives, projects and activities? And how does all this align with and support business objectives?
- As actions/risk treatments are completed, several information risks on the PIG should move as they become less likely and/or severe – so review, update and reconsider the PIG periodically. Look especially hard for changes such as new or emerging information risks that aren’t yet represented on the PIG. Relevant incidents and near-misses that aren’t adequately reflected in identified risks indicate omissions in your risk identification and assessment process … so look for others too, and make improvements. If necessary, run focus group sessions to address information risks that remain stubbornly stuck in the orange or red zones. If risk treatments aren’t working, what needs to be done to fix them? Are there alternative approaches worth trying? Are there competing priorities or constraints that management needs to address … or are the risks acceptable, in fact (if so, get that in writing! Hold someone senior personally accountable for the risk acceptance decision)?
- Keep notes on the risk management process, the workshops, techniques, issues etc. and refine the approach every time it runs. [That’s how I got here, and my journey to enlightenment continues!]
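For anyone who prefers a script or spreadsheet to sticky notes, here is a minimal sketch (in Python) of that zone-and-prioritize step. The 1-to-5 scales, zone thresholds and example risks are all arbitrary assumptions of mine, for illustration only: in practice the ratings emerge from the workshop discussion, not from a formula.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: int  # 1 (rare) to 5 (almost certain) - illustrative scale
    impact: int       # 1 (trivial) to 5 (severe) - illustrative scale

def zone(risk: Risk) -> str:
    """Map a risk onto a PIG zone; the thresholds here are arbitrary examples."""
    score = risk.probability * risk.impact
    if score >= 15:
        return "red"     # treat urgently
    if score >= 8:
        return "orange"  # treat soon
    return "green"       # monitor; perhaps relax or retire existing controls

# Hypothetical workshop output - names and scores invented for illustration
risks = [
    Risk("Sudden total failure of a key supplier", probability=2, impact=5),
    Risk("Minor delays on non-critical supplies", probability=3, impact=2),
    Risk("Prolonged outage of a critical IT system", probability=3, impact=5),
    Risk("Errors by individual employees", probability=4, impact=3),
]

# Draft treatment plan: red zone first, then orange, then green,
# with the highest probability-times-impact score first within each zone
zone_order = {"red": 0, "orange": 1, "green": 2}
plan = sorted(risks, key=lambda r: (zone_order[zone(r)], -r.probability * r.impact))

for r in plan:
    print(f"{zone(r):6}  P={r.probability} I={r.impact}  {r.name}")
```

The same agreed ratings could just as easily be plotted as a scatter chart to produce the colourful PIG itself, in PowerPoint, Visio, a spreadsheet or whatever: the zoning and sequencing are trivially mechanical once the workshop has settled on the probability and impact ratings.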
The PIG part of this approach is especially controversial, I know. There are other forms of risk analysis, including truly quantitative approaches (based on actual data and mathematically sound models) and other qualitative methods … but I find this good enough for my purposes, and simple enough that, once they get the hang of it, workshop attendees focus on discussing and understanding and tackling the risks rather than obsessing about the analytical method. YMMV.
For truly business- or safety-critical situations, or if you are uncertain whether any given approach is OK, you might try several different methods, comparing and contrasting the results for additional insight. Chris Hall has previously suggested involving different groups of people in separate sessions to emphasize their different perspectives, expertise and interests (cool tip - thanks Chris!). It’s hard, though, to bottom out the reasons for the differences, not least because this is all based around predicting an inherently uncertain future. It’s all crystal ball gazing, closer to witchcraft and alchemy than science. There comes a point where it’s better to just get on with it and see how things go than to continue endlessly refining the analysis or obsessing about the methods. You can always come back later for another gaze, another go at mixing your magic potions. Meanwhile, those risks need treating, the red ones urgently.