Saturday 30 July 2011

Disclosing our sources

These are some of the key resources we use routinely to find out about and learn from information security incidents:
  • Google, of course.  We search often using the Google toolbar in our browser.  We have learnt to craft more effective queries by exploiting Google’s search syntax, including the advanced search functions.

  • Google Alerts are a helpful way to trawl the Web daily for specific news and tidbits relevant to the monthly topics, especially since we discovered how to integrate alerts into our RSS/blog reader …

  • Google Reader is, currently, our RSS/blog reading weapon of choice.  Have you spotted the not-too-subtle pattern here?  Google rocks! 

  • Hyperlinks embedded within other sources.

  • Blogs, particularly information security blogs from information security gurus and respected tech journalists, but sometimes we enjoy naïve or counter-cultural blogs, even those from the Dark Side, the hacker underground (as in ‘know your enemy’!).

  • Academic and trade journals, such as EDPACS, ISSA Journal and (ISC)2 Journal.

  • Industry associations, meetings and peers.

  • Magazines such as Hakin9 and ClubHACK.

  • General news media – yes, even TVNZ, the BBC, CNN and others occasionally highlight information security incidents or issues that haven’t already come to our attention elsewhere, albeit rather superficially.

  • Information security surveys such as those from Secunia, CSI and PwC (including the biannual breaches survey).  While these sometimes describe interesting incidents, they tend not to be very recent.  Surveys are more useful for what they reveal about information security threats.
What do you use?
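To make the query-crafting point concrete, here's a minimal sketch in Python (standard library only).  The operators shown are real Google search syntax, but the example terms and the idea of assembling the URL programmatically are purely illustrative:

```python
from urllib.parse import urlencode

# A few of the query operators we lean on (illustrative, not exhaustive):
#   "..."      match an exact phrase
#   site:      restrict results to one domain
#   filetype:  restrict results to one document type
#   -term      exclude pages containing a term
query = '"information security incident" site:gov filetype:pdf -draft'

# Build the same search URL the toolbar would submit:
url = "https://www.google.com/search?" + urlencode({"q": query})
print(url)
```

The same operators work just as well typed straight into the search box, of course - the point is simply that a few minutes learning the syntax pays off in sharper results.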

Thursday 28 July 2011

Learning from information security incidents

Information security incident management processes are meant to help the organization contain incidents and recover from them more efficiently.  Well-designed processes also enable the organization to understand the risks that materialized, analyze and identify the root causes, and make improvements to the security controls in order to reduce the risk of further incidents.

The School of Hard Knocks is an effective but rather brutal institution.  We can certainly learn from the information security incidents we suffer directly, but they can be costly - devastating even.  The worst can literally threaten the organization’s survival.  Hard knocks indeed!  

The awareness materials this month extend the idea of learning from our own information security incidents to take in lessons from incidents affecting third parties.  The idea is to gain the knowledge without actually suffering the adverse impacts of information security failures. 

It’s obvious when you think about it, but does your organization do this systematically?

Tuesday 19 July 2011

On being 'secure enough'

SecurityWeek invites readers to complete a checklist/questionnaire to figure out whether their security awareness programs are "good enough".  I was pleased to rate myself in the top-scoring category:
"If you scored 55 or more “yes” answers, you already know this stuff and have yourself under control. You could probably be teaching other organizations how to design and implement security awareness programs. You have a well-defined and executed program that pretty consistently exceeds standards of due care. Maintain your program and stay vigilant on quality updates."
Well yes, in a sense I am 'teaching other organizations how to design and implement security awareness programs' through our awareness service so the high score is to be expected. In fact, we deliver rather more than the checklist requires*, but it got me thinking about whether it is realistic to expect our customers, or indeed less fortunate organizations :-) to adopt all the awareness practices and topics mentioned in the checklist, or in books such as Rebecca Herold's Managing an Information Security and Privacy Awareness and Training Program.

The reality is that the range and scope of awareness programs varies enormously, depending on factors such as:
  • The level of management support for information security and/or awareness;

  • The energy, enthusiasm and drive of the person or team running the awareness program, plus their own preferences, expertise and experience;

  • The maturity of the awareness program, and its perceived value and effectiveness to date;

  • The breadth of information security issues facing the organization.
A few organizations are either not doing any security awareness, or are stuck in the groove of annual 'awareness training sessions' or begrudging, minimal compliance with their legal and regulatory requirements, which is frankly not much better.  As the checklist author put it: "To put it bluntly, you are probably an accident waiting to happen."

I struggle to understand how management expects the organization to be secure if it fails to inform and motivate its employees on security matters.  It's a curious form of myopia/blindness.  Perhaps these same managers put all their faith in antivirus and firewalls ... right up to the point that they are hit by one massive security incident (à la RSA) or a string of (slightly) smaller ones (Sony-style).  Meanwhile, they are slowly being bled dry by the background noise of information security incidents which nobody notices or cares about.  What a waste!


* We cover a wider choice of information security topics, with a broader range of awareness materials, and last but not least we create awareness materials for IT professionals as well as for general employees and managers.  What do you do?

PS  Aside from the differences between organizations, different parts of an organization may be at different stages of maturity with respect to information security and/or security awareness.  And it's a dynamic, fluid situation - for example, awareness levels are likely to be higher soon after a major incident or event than before it.

Monday 18 July 2011

Unclassified but still worth protecting

An unusual news item in the Federal Times says that the US DoD is proposing to impose information security requirements on defense contractors regarding unclassified information, supplementing those for classified information.  The article goes on about blurring the distinctions between classified and unclassified information, and claims the compliance costs across the industry will be enormous, but if so I'm puzzled at the implication that such information is not already being adequately protected by contractors.  Surely any organization that handles classified military information is well aware of information security risks and controls, so I would be very surprised if unclassified information is as insecure as the journalist suggests.

Thursday 14 July 2011

Cross site scripting made simple

A well-presented video tutorial from the OWASP team explains in simple terms how one form of cross-site scripting (XSS) works.

XSS is a bit tricky to explain.  The video makes good use of graphics to put the message across, without getting too technical.

If you are a web developer, you should be well aware of XSS, in sufficient depth to know how to prevent this form of attack on visitors to your websites.  The tutorial barely hints at the technical controls needed, but future editions will go into more depth.  Meanwhile, the excellent OWASP site includes lots more information and even some code snippets to give you a head start on securing your site.
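To illustrate one of the controls the tutorial hints at - output encoding - here's a minimal sketch using Python's standard library.  The rendering functions are our own invention, purely for illustration; real sites should lean on their web framework's templating and the OWASP guidance rather than hand-rolled escaping:

```python
import html

def render_comment_unsafe(comment: str) -> str:
    # Vulnerable: user input is dropped straight into the page markup,
    # so an injected <script> tag would execute in visitors' browsers
    return "<p>" + comment + "</p>"

def render_comment_safe(comment: str) -> str:
    # Output-encoding the input turns markup characters into inert text
    return "<p>" + html.escape(comment) + "</p>"

payload = "<script>alert('XSS')</script>"
print(render_comment_unsafe(payload))  # script tag reaches the page intact
print(render_comment_safe(payload))    # rendered harmlessly as visible text
```

The whole class of attack comes down to that one difference: whether untrusted input is treated as markup or as data.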

Tuesday 12 July 2011

You have the right to remain silent ...

... while we force you to enter your passphrase into your computer to decrypt the data potentially comprising incriminating evidence. According to the CNET article:
"Prosecutors stressed that they don't actually require the passphrase itself, meaning Fricosu would be permitted to type it in and unlock the files without anyone looking over her shoulder. They say they want only the decrypted data and are not demanding "the password to the drive, either orally or in written form.""
The ramifications of governments 'allowing' 'ordinary' 'citizens' access to strong encryption are many and varied. What if citizens have the nerve to protect information which they consider highly confidential but which the government desires to access? Of course the government has the resources to try to defeat the cryptosystem, whether by brute-force attack or cryptanalysis. It also has the resources and means to attempt to steal passphrases using Trojans or other surveillance techniques, or insert and access backdoors, or insist on escrow. We know it has the rubber hose necessary for coercive cryptanalysis. And if it had the means to read citizens' minds, you can bet it would apply them. But for now, being forced to go through the courts to demand that citizens decrypt their own information for the benefit of the government (and, arguably at least, for society at large) is, for me, a step too far. 
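To put the brute-force point in concrete terms, here's a minimal sketch (Python standard library; the passphrase, salt handling and iteration count are purely illustrative) of how modern cryptosystems derive an encryption key from a passphrase.  The deliberately slow key-derivation step is what makes each brute-force guess expensive:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    # PBKDF2 stretches the passphrase into a 256-bit key; the iteration
    # count multiplies the work an attacker must do per guessed passphrase
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)                              # stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32                              # 256-bit symmetric key
```

With a long passphrase, exhaustive guessing is computationally hopeless even for a well-resourced adversary - which is precisely why the courts, rather than the cryptanalysts, are being asked to do the work.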

Just as with the legal presumption of 'innocent until proven guilty', I accept that some guilty parties will 'get away with it' if their crypto-secrets are in fact strong enough to remain secret, but on balance this is better than the alternative. If the government has the legal right to demand that its citizens incriminate themselves, the government cannot also demand the support of its citizens - the very citizens who give it the authority and power to act on their behalf. 

George Orwell saw it coming.

Sunday 3 July 2011

Changing the culture of an entire industry

Engendering a culture of security is something we normally talk about in relation to organizations and parts thereof (for example, changing the culture within management or within the IT department).  I'm sure that most people who have actually tried to do this would agree that it's a tough challenge.  It's not even entirely obvious how to define, let alone influence or change corporate cultures. It's one of those things that is easier to say than to do.

OK, now imagine your task is to engender a culture of security across a massive public body - like for example the UK's National Health Service.  According to a piece in SC Magazine, the Information Commissioner is calling for changes in the NHS:
“The sector needs to bring about a culture change so that staff can give more consideration to how they store and disclose data. Complying with the law needn't be a day-to-day burden if effective measures are built in and then become second nature."
Actually, the quote is a bit ambiguous regarding the scope: is the Commissioner concerned with just the NHS or the sector - presumably the health sector in the UK?   Either way, changing the culture is a massive undertaking.

He continues:
“My office is working with Connecting for Health to identify how we can support the health service to tackle these issues.”
I looked through the Connecting for Health website to see what they have to say about information security or privacy. Nothing was obvious at first, until I came across the Information Governance section (hint: governance is not normally a synonym for security, but the NHS seems to be developing its own parallel language, for example referring to Serious Untoward Incidents, or SUIs, where plain old 'incidents' would normally suffice). There I discovered some red tape for requesting access to the NHS network, out-of-date and inaccurate information about the "ISO 27000 series of standards" (that's ISO/IEC 27000), a "detailed 17-page document explaining the background and development of both patient and clinician 'sealed envelopes' functionality" plus, of course, a PowerPoint presentation to explain the 17 pages (!), a vague introduction to information security, and various other bits.

Overall, the website leaves a poor impression regarding information security. The information is disjointed, minimalist and full of jargon, so that's one area in which the Information Commissioner can usefully apply pressure supporting the cultural change he anticipates. A coherent, accessible, useful and engaging website would be a worthwhile vehicle for a security awareness program.