Thursday 30 June 2016

Cryptography - our security awareness topic for July

Cryptography gives us powerful and yet fragile information security controls. 

Strong confidentiality and authentication mechanisms are wonderful provided they are well designed, implemented, used, managed and maintained … but cryptographic controls have a nasty tendency to fail open, sometimes becoming spectacularly insecure - just one of the information risks associated with cryptography. 

Since this is ‘only’ a security awareness module, we’ve avoided delving into the advanced mathematics that underpins cryptography, while at the same time giving enough information for the module to be both interesting and actionable. Cryptography is a complex, technical topic, for sure, but that's no reason for the awareness program to ignore it and hope for the best!

Even if you have the expertise and interest to research and prepare your own awareness materials, wouldn't you rather spend your valuable time interacting with your colleagues, spreading the word about information security and helping them see the light?

Talking of spending time in the organization, the train-the-trainer guide in the module offers guidance on how we envisage the materials being used, and offers a bunch of creative ideas to make your awareness program more interactive and, yes, fun. This month, there are some “crypto-toys” for workers to explore basic encryption mechanisms, hands-on, and the chance to mess around with medieval-style wax seals, not unlike those on our awareness posters every month.  More than simply a design touch, they are a subtle historical reference to a physical form of information security, a tip o’ the hat to our predecessors.
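For the curious, the kind of basic mechanism those crypto-toys demonstrate can be sketched in a few lines of Python. This is purely illustrative (a classical Caesar shift, nothing like a modern cipher, and not part of the module itself):

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` places, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return ''.join(result)

ciphertext = caesar("Attack at dawn", 3)   # "Dwwdfn dw gdzq"
plaintext = caesar(ciphertext, -3)         # decrypt by shifting back
```

A hands-on toy like this makes the point memorably: the 'key' (the shift) is tiny, so the scheme is trivially breakable - a nice springboard for discussing why key size and algorithm strength matter.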

Tuesday 28 June 2016

ISO27k conference in San Francisco, end of Sept


It's a 2-day conference plus optional workshops the day before and training courses afterwards, in the final week of September at a smart purpose-built conference facility on the outskirts of San Francisco airport, not far beyond the boundary fence I think.  Standing speakers may need to duck, and shout.

There will be sessions on:
  • ISO27k basics
  • ISO27k implementation
  • ISO27k for cloud security
  • Integrating ISO 22301 (business continuity) with ISO27k
  • ISO27k metrics …
and more.

Walt Williams of Lattice, Richard Wilshire (ISO/IEC JTC1/SC27 project leader for the total revamp of ISO/IEC 27004 on “Monitoring, measurement, analysis and evaluation”), and Jorge Lozano from PwC are all presenting on metrics at the conference, and FWIW me too.  I’m hoping to persuade Krag Brotby to attend as well.   

Aside from the conference sessions, it is lining up to be The Place for security metrics newbies and wise old owls alike to put the world to rights during the coffee breaks, maybe over a meal, and then inevitably at a nearby airport hotel bar until the wee small hours.  Should be a hoot.

Saturday 25 June 2016

Information risk - the Next Big Thing

It strikes me as deeply ironic when a peer acknowledges that the most important thing in cybersecurity is not the technology but the people. The irony is deeper still, given that the comments stem from a Gartner conference.

Anyway, I see a faint glimmer of hope that, finally, the cybersecurity bandwagon might be trundling out of town. And good riddance! Frankly, I'll be glad to see the back of it. Cybersecurity may be a gigantic feeding trough but it is so 20th Century.

Way back in the 80s when I started my professional career, "computer security" was just becoming the thing. The reasoning was simplistic: computing was a costly and new/risky investment that had to be protected. However as mainframes gave way to minis and then micros, mainstream IT gradually became a humdrum commodity. Admittedly, there is still competitive advantage to be gained by strategic investments in IT, including old-school systems and software development (as opposed to merely assembling and configuring commercial off-the-shelf products including cloud-based services). But how many present-day organizations have their own in-house application development function?

Through the 90s and 00s, we've surfed the waves of "IT security", "GRC" (governance, risk, compliance) and now "cybersecurity" ... and yet it is the information that I seek to secure, not (just) the cyber. Once again, the reason is simple: there's more value in the information being processed than in the fancy electronics doing the processing. There's much more at stake if the information is threatened than if the technology is under pressure. We can always pop down the road and buy another box. Securing the machines per se doesn't necessarily protect the information, especially if you realise that a substantial amount and value of business information is never even computerised in the first place (a fact that the cybersecurity crowd either remain blissfully ignorant of, or conveniently choose to ignore).

Worse still, we're not even much good at securing the technology. Ransomware is the latest in a long line of demonstrations of our collective ineptitude, and perhaps our arrogance. The controls against ransomware are basic, and yet there are evidently vulnerable victims a-plenty.

For what it's worth, I predict the Next Big Thing after cybersecurity will be "information risk" by which I mean "risks associated with or arising from information". Since information is widely acknowledged to be an extremely valuable, if not invaluable business asset these days, the related risks deserve to be properly addressed, making this very much a business issue. The linkage is direct and obvious. Now that's a bandwagon I'd clamber aboard - in fact I might even be driving it.

Following that, the next Next Big Thing might just involve "opportunity". Information security and cybersecurity professionals are, on the whole, obsessive about mitigating risks, as if risk is inherently evil, something that ideally ought to be eradicated or at least understood and brought under control. The idea that some risks might actually be good and beneficial, to be embraced and willfully exploited, is anathema to most of my peers. There are glimpses of an alternative approach but they are rare indeed: I'm thinking, for instance, of the possibility of deliberately exaggerating the quality and strength of our defenses as a deterrent control in its own right, perhaps faking the messages indicating that our Internet-facing systems are fully up to date with all the current security patches. Furthermore, ethics and legalities aside, the use of penetration testing, social engineering, malware, cryptanalysis and so forth as offensive weapons to compromise competitors or other adversaries is simply not discussed in polite conversation between infosec pros. You don't find mainstream conference speakers extolling the virtues of building offensive cyber capabilities, except perhaps in the military and defense world. Compliance is so deeply ingrained in us that few would even consider delaying let alone lying about compliance with assorted security, privacy, governance and other obligations.

That's all very well but what if our adversaries have never even heard of the Marquess of Queensberry? We're not even taking a knife to a gunfight: all we have are our secret decoder rings, plywood swords and cardboard shields. And let's face it, we look ridiculous in our blue Spandex suits and face masks ("Speak for yourself Gary. I look cool!").

PS  Although 'information security risk' is mentioned in many ISO27k standards, the term is not defined as such in ISO/IEC 27000:2016.

Tuesday 21 June 2016

Another one bites the dust ...

My PC has been going steadily downhill for the past week or two, until finally at the weekend it plummeted off the cliff's edge into the deep blue.

The symptoms were confusing: it would freeze up randomly, sometimes thawing and sometimes slowing to a crawl but occasionally becoming totally unresponsive, requiring a reboot. There were no error messages, at least none that I noticed. The Windows error log was no help, and there was no obvious pattern to it. I couldn't pin it on any specific app or situation - even reverting a recent software update on an app that I run 24x7 made no difference. There are no reported viruses. The PC isn't overheating and the mains supply is reliable.

Well, with 20/20 hindsight, there were some little clues about the underlying cause. Saving fairly large files sometimes took a bit longer than normal due to the PC pausing for breath in mid-save. MP3 music would sometimes stutter, endlessly repeating a few seconds like a scratched vinyl record or a very talented parrot. 

Defragmenting the disks with the Windows built-in function or with Piriform's Defraggler utility (which, I suspect, is just a pretty user interface layered on top of the selfsame Windows tools) didn't help, and in fact one of the disks refused to defrag fully ... so, suspecting a disk problem, I tried CHKDSK and took a look at the S.M.A.R.T. reporting. Neither seemed to indicate anything wrong, although the S.M.A.R.T. parameters aren't exactly simple to understand. I guess I'm just not sufficiently familiar with the normal values to spot something out of the ordinary. Does any of this look bad to you?


For instance, the S.M.A.R.T. 'Read Error Rate' on this particular disk has a 'real value' of zero, but a 'Current' value of 200 which appears to be the worst (worst ever, I guess, presumably for the lifetime of the drive) ... and a 'Threshold' value of 51. So is the 200 or zero good news or bad? There is no red flag, nothing but the merest hint that the drive might perhaps be about to leap off a handy cliff. The other S.M.A.R.T. parameters are just as confusing. Have I really only power-cycled this disk 7 times? Somehow I doubt that. I'm just not smart enough for S.M.A.R.T.
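For what it's worth, the usual convention (as I understand it - vendors vary and the raw values are often opaque) is that the normalized 'Current' value counts *down* from some arbitrary starting point such as 100 or 200, and the attribute is only deemed failing once it sinks to or below the vendor's 'Threshold'. On that reading, 200 versus a threshold of 51 is nominally healthy. The logic, sketched:

```python
def smart_attribute_status(current: int, worst: int, threshold: int) -> str:
    """Interpret normalized S.M.A.R.T. values the conventional way:
    higher is better; the attribute is only flagged once the normalized
    value drops to or below the vendor-defined threshold."""
    if current <= threshold:
        return "FAILING NOW"
    if worst <= threshold:
        return "FAILED IN THE PAST"   # 'worst' records the lowest value seen
    return "OK"

# Read Error Rate on my drive: Current 200, Worst 200, Threshold 51
print(smart_attribute_status(200, 200, 51))   # "OK" ... allegedly
```

Which only underlines the point: a drive can be visibly dying while every normalized attribute still reports "OK".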

The final clue to my failing hardware came when Word steadfastly refused to open a large (250-page) document that I had been working on lately. The file looked normal in Explorer but Word stopped opening it about a third of the way through the progress bar, complaining that it was corrupted.

Oh oh.

Naturally, being an infosec pro and 'professionally paranoid', I have multiple backups of the disk in question, the most recent being about a week ago (I really should sort out a daily backup regime!) ... so I decided the best approach was to try to copy the dying disk contents to another drive and then hopefully restore any failed/corrupted files from backups. The disk copy started OK but a few minutes later the errors started coming thick-n-fast. Having selected 'skip' to ignore the corrupted files, the PC did its best to copy the remainder. Judging by the bursts of normal-speed copying interspersed with slow or dead-slow periods, the problem seemed to afflict various parts of the disk differently (perhaps a head crash or misalignment?). It turned what is normally a 30 minute job into an all-day marathon.
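For anyone wanting to script that copy-and-skip approach rather than clicking 'skip' repeatedly, a rough sketch follows. (A dedicated recovery tool such as ddrescue does this far better at the sector level; this is just the file-level idea.)

```python
import os
import shutil

def salvage_copy(src_dir: str, dst_dir: str) -> list[str]:
    """Copy whatever can be read from a failing disk into dst_dir,
    skipping unreadable files and recording the casualties."""
    failed = []
    for root, _dirs, files in os.walk(src_dir):
        rel = os.path.relpath(root, src_dir)
        target = os.path.join(dst_dir, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            try:
                shutil.copy2(os.path.join(root, name),
                             os.path.join(target, name))
            except OSError:
                failed.append(os.path.join(rel, name))  # note it, move on
    return failed
```

The returned list of failures is exactly the shopping list for the restore-from-backup step that follows.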

That was yesterday. Today I've been checking the transferred files and recovering a few obviously missing ones from backups. All the apps I have tried so far seem to work fine, and my blood pressure is heading back down towards the normal range.

While looking for another disk to replace the dodgy one, I checked through our heap of SATA drives on the side, setting aside those marked "DEAD" or "DYING" in bold red marker pen. I now have two dead Western Digital 250 GB SATA 2 drives and two dead Seagate 1 TB SATA 3 drives: all have expired within the past couple of years or so. The Seagate Barracudas were particularly disappointing, failing much sooner than I expected for no obvious reason other than poor quality manufacturing, so I won't be buying or recommending Seagate ever again. The WD Caviar drives have done rather better, lasting about 7-9 years in daily use. The take-home message for me is not to expect more than about 5 years' life out of a decent disk, even less from Seagate.

Today I ordered two new WD Black 1 TB SATA 3 drives. The "Black edition" drives are evidently tested more thoroughly than the Blues, and come with a 5 year manufacturer's warranty - although it is a 'return to base' warranty which the cynic in me suspects means I'm paying extra for the dubious privilege of being a WD beta tester. I also ordered an external USB3 double disk caddy that can make disk-to-disk copies, presumably bit-copies or sector-wise duplication: that will come in handy to make belt-n-braces backups of my backup disks to store offline in the fire safe. Like I said, I'm professionally paranoid!

Aside from new hardware and, hopefully, a daily backup regime, I need to put more effort into monitoring and hopefully understanding those S.M.A.R.T. parameters. Maybe I ought to investigate RAID for its real-time data duplication, perhaps even cloud storage if our rural broadband can take the strain? A policy of retiring disks before they hit 5 years of age makes sense too, for the primary data disks anyway. Windows and apps can always be reloaded. The data are what counts. There's gold in them thar bits.

The real trick, of course, is to use this security incident and make those improvements. It's all very well me blogging about them: it amounts to nothing but good intentions unless I follow-through and complete the corrective actions. Hopefully, though, this little case-study has made you contemplate your own situation, your controls, your disks, data and backups, especially if you are a small business like us or a power user at home. You get the benefit without the costs and the blood pressure spike. Watch and learn.

Saturday 4 June 2016

Do not lift this cover


Having accidentally sent a journalist an ineptly redacted document, the Public Health Agency of Canada is - quite rightly - roasting uncomfortably in the glare of the media spotlight today:
"Raphael Satter, an Associated Press correspondent in Paris, was dumbfounded when he received files from the Public Health Agency of Canada that were censored using only Scotch tape and paper ... He was able to see the redacted confidential information simply by peeling back the paper."
There are at least 11 information risks or types of incident associated with redaction:
  1. Making bad decisions about the data to be redacted, the technical methods or process to be used and/or the suitability (primarily competency and diligence) of those tasked to do it;
  2. Failing to identify correctly all the sensitive data that must be redacted (both the individual data items and the files);
  3. Failing to render the redacted data totally unrecoverable, for example:
    • Using inappropriate or ineffective technical methods for redaction, such as crudely modifying rather than permanently deleting the sensitive data using methods that can be completely or partially reversed (for example simply reformatting or overlaying redacted text to appear invisible, or applying readily-reversed mechanistic transformations or tokenization of textual identifiers);
    • Accidentally leaving one or more copies of the sensitive data completely or partially unredacted (perhaps releasing multiple, independently and differently redacted versions of a sensitive document, enabling it to be reconstructed directly or by inference);
    • Partially deleting the sensitive data, leaving data remnants or sufficient information (such as the editing journal or cached copies) enabling the data to be restored from the redacted file;
    • Relying excessively on pixellation, blurring or similar methods of obfuscation to obscure parts of images (typically for personal privacy reasons), whereas deconvolution and other more or less advanced image manipulation/transformation techniques may restore enough of the original image to permit recognition;
    • Neglecting to redact sensitive metadata (e.g. in document properties or reviewer comments, GPS data on digital images, or alternate data streams);
  4. Failing to distinguish all redacted from non-redacted data, consistently and accurately, such that recipients know unambiguously which parts are no longer original;
  5. Excessive or inappropriate redaction, removing more than just the specific sensitive items that were supposed to have been redacted or doing so clumsily (which raises the prospect of having to justify redaction decisions and activities to a trustworthy intermediary or authority);
  6. Inappropriately or inadvertently altering the meaning of the remaining data as a result of contextual issues (e.g. deleting selected data records may invalidate statistical analysis of the remainder), or by causing collateral damage to the file structure (such as file integrity issues and inappropriate formatting changes) during the redaction process;
  7. Leaving sufficient data in the file to enable recipients to infer sensitive information, perhaps in conjunction with other available information sources (e.g. replacing people’s names with anonymous labels in a redacted file but separately disclosing the relationship between labels and names; disclosing anonymous statistical data on known small populations; disclosing the number of characters redacted, and perhaps even giving clues to the most likely characters by dint of their printed size; applying data mining, correlation and inference techniques to glean sensitive data from redacted or anonymized content);
  8. Placing excessive reliance on redaction, believing it sufficient to keep sensitive data totally confidential under all circumstances whereas technical and process failures are possible and incidents sometimes occur in practice; conversely, placing zero reliance on redaction, believing it to be totally incapable of protecting sensitive information (these are governance and assurance risks);
  9. Information security issues that are incidental or peripheral to the redaction process itself such as:
    • Sending the original files, redaction instructions, redacted content or indeed the redacted files to the wrong recipients;
    • Failing to secure information relating to the redaction process, such as the original files or detailed redaction instructions, while in transit, during processing and in storage (e.g. interception of sensitive content in clear on the network);
    • Accidentally disclosing unredacted versions of the file, whether at the same time and through the same disclosure mechanism or separately;
    • Deliberate disclosure or ‘leakage’ of unredacted versions of the file without permission or inappropriately (e.g. to Wikileaks);
    • Accidentally or deliberately disclosing the redacted information by some means other than by releasing the digital data (e.g. by releasing the redaction instructions, or being overheard discussing sensitive matters);
    • Damaging the integrity and/or availability of the original unredacted files (e.g. overwriting them with the redacted versions);
  10. Use of redaction to conceal illegal or inappropriate activities (such as pedophilia - image redaction was ineffective in that particular case!);
  11. Various other risks (the risk analysis implied here is generic and not comprehensive: it does not necessarily reflect any specific situation).
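Risk #7 is easy to demonstrate. If the redaction preserves the length (or rendered width) of the hidden text, an attacker with a plausible candidate list can eliminate most of it immediately. A toy illustration, with entirely made-up names:

```python
def candidates_by_length(redacted_length: int, candidates: list[str]) -> list[str]:
    """Given the number of characters blacked out, keep only the candidate
    names that fit - a crude but often effective inference attack."""
    return [name for name in candidates if len(name) == redacted_length]

# Hypothetical scenario: the redaction bar covers exactly 9 characters
suspects = ["Ann Smith", "Raj Patel", "Christopher Wren", "Jo Ng"]
print(candidates_by_length(9, suspects))   # narrows the field to two names
```

Combine that with other disclosed context (job title, dates, known small populations) and the 'redacted' identity may be recoverable with high confidence, without ever touching the blacked-out data itself.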
The Public Health Agency of Canada redactors appear to have experienced risks #9.1, 9.3 and 8 on the list ... and possibly others too (e.g. #3: even if they had photocopied the paper-masked page and sent the photocopy, it’s quite possible the original text would have been discernible through the mask). 

Instead of merely being an intensely embarrassing privacy incident, this could literally have been a killer if, say, a security services informant, undercover agent or counter-terrorism operation had been accidentally unmasked.  Let’s hope the relevant parties are more competent than the agency in this case.

Friday 3 June 2016

Security awareness module on trust and ethics

June’s awareness module covers trust and ethics - no ordinary, run-of-the-mill awareness topic ... but then ours is no ordinary, run-of-the-mill awareness service! The module draws out important awareness messages that are directly relevant to information risk and security.

'Ethics' is a pervasive self-control underpinning many others. Ethical people think and behave honorably in ways generally considered correct and appropriate. They are open and honest, respectful of others and concerned about ‘doing the right thing’ and ‘doing things right’. In respect of information security, ethical behavior reinforces procedural controls – for instance, unethical people who disregard the principles, ignore policies and flout the procedures materially weaken the organization’s information security.

Trust and trustworthiness form the basis for collaborating with and depending on others, without the costs, disruption and aggravation implied by distrust and untrustworthiness. 

As well as being personal matters, trust and ethics also operate at the organizational level. Ethical, trustworthy organizations are held in high regard by others, while unethical ones are avoided.

Read on and get in touch to subscribe.  'Trust and ethics' is just one of twelve thought-provoking security awareness topics we're covering this year. 

Security innards


When someone foolishly lets Marketing loose on cybersecurity products, you end up with this kind of mishmash in your inbox:
"[Our product] can help you pinpoint the exact vulnerabilities that are currently active in your IT environment. Since not all vulnerabilities are threats to your organisation’s security, it’s important to focus on fixing the high-risk ones first—and fast. [Our product] gives you the intelligence to prioritise your remediation efforts to address the vulnerabilities that pose the highest risk of compromise."
It has been stuffed with a bunch of keywords to such an extent that it no longer makes sense, obscuring the value of the product. I believe there might actually be a decent new product lurking beneath all that tripe but, speaking personally, I'm not prepared to rummage through the entrails to find it.

Ho hum.