Tuesday 12 March 2024

A nightmare on DR street


A provocative piece on LinkedIn by Brian Matsinger caught my beady eye and sparked my fertile imagination today. I'm presently busy amplifying the disaster recovery advice in NIS 2 for a client. When I say 'amplifying', I mean generating an entire awareness and training piece on the back of a single mention of 'disaster recovery' in all of NIS 2. Just the one. Blink and you'll miss it.

Oh boy.

Anyway, Brian points out that recovering from disasters caused by 'cyber attacks' requires a different DR approach from the usual one for physical disasters such as storms, fires and floods. Traditional basic DR plans are pretty straightforward: essentially, the plans tell us to grab recent backups and pristine systems, restore the backups onto said systems, do a cursory check, then release services to users. Job's a good 'un, off to the pub, lads.
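To labour the point, here's roughly what that runbook amounts to if you write it down as a script. A deliberately naive sketch with made-up paths, a hypothetical Postgres restore command and an equally hypothetical health-check URL, not something I'm suggesting anyone actually runs:

    #!/usr/bin/env python3
    """Naive DR restore: grab the latest backup, restore it, poke the service, declare victory."""
    import glob
    import os
    import subprocess
    import urllib.request

    BACKUP_DIR = "/mnt/backups/crm"                           # hypothetical backup location
    RESTORE_CMD = ["pg_restore", "--clean", "--dbname=crm"]   # assumes a Postgres dump, purely for illustration
    HEALTH_URL = "https://crm.example.internal/health"        # the 'cursory check'

    def latest_backup() -> str:
        # Most recent file wins -- no questions asked about whether it's intact or trustworthy
        return max(glob.glob(os.path.join(BACKUP_DIR, "*.dump")), key=os.path.getmtime)

    def restore(backup: str) -> None:
        subprocess.run(RESTORE_CMD + [backup], check=True)

    def cursory_check() -> bool:
        # One HTTP 200 and we're off to the pub
        with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
            return resp.status == 200

    if __name__ == "__main__":
        restore(latest_backup())
        if cursory_check():
            print("Services released to users. Job's a good 'un.")
        else:
            print("Er ... now what?")

Spot the assumptions: the newest file is assumed to be a good backup, the system it lands on is assumed to be clean, and one HTTP 200 is assumed to prove the business is back. Hold that thought.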

It doesn't take a genius to recognise the dependence on having sound backups and pristine systems even in that laughably simplistic scenario, implying risk. The cursory check is also a concern, particularly if there is a genuine possibility that the backups, servers or restoration processes didn't in fact go entirely to plan ... which segues into considering DR in the aftermath of a typical malware or hacker attack, perhaps ransomware or some similar form of extortion. Uh oh: now the backups are untrustworthy at best, missing or unusable at worst, while 'pristine systems' raises questions too.
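At the very least you'd want the restore to check that the backup files are still the bytes you originally wrote, before you even ask the nastier question of whether those bytes were already compromised when they were written. A minimal sketch, assuming (big assumption) that the backup job recorded SHA-256 hashes somewhere the attackers couldn't reach, such as offline media or a write-once store; the manifest path here is made up:

    import hashlib
    import json
    import os

    HASH_MANIFEST = "/mnt/worm-store/backup-hashes.json"  # hypothetical out-of-band, write-once manifest

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    def backup_is_intact(path: str) -> bool:
        # Confirms the file matches the hash recorded when the backup was taken.
        # It says nothing about whether the data was already riddled with malware
        # at that point -- that calls for scanning and forensics, not arithmetic.
        with open(HASH_MANIFEST) as f:
            expected = json.load(f)
        return expected.get(os.path.basename(path)) == sha256_of(path)

Even that modest upgrade only holds if the manifest itself survives the incident intact, which is precisely the sort of thing a competent extortionist goes after first.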

Leaving those concerns aside and blithely assuming the process goes well, the restored systems provide services and, just as we're about to head out for a swift ale, they once again collapse in a heap. Little did we know that the miscreants responsible for the original incident remained lurking in some dark, forgotten corner of the network, or found their way back in, or were invited back in by persistent malware within the restored backups, or, these days, simply observed from a distance as the autonomous malware went about its evil deeds.

Rather than elaborate further, I'll leave it there for now. By all means picture yourself in this situation and finish the story yourself, adding as many nightmare variants as you care to dream up. Mix in a sprinkling of coercion, equipment malfunction, accidents, fraud and incompetence, and you may not sleep so well tonight.

OK, now the real horror show begins. 

Imagine that the cyber incident that caused the original disaster was not just some hoodied hacker intent on adding another yacht to his fleet, but a team of highly skilled and spooky adversaries, trained and resourced by a noxious nation state to attack our critical national infrastructure, combining advanced persistent threats with dastardly techniques, perhaps even munitions.

You may think I've skipped my anti-paranoia meds, again, but perhaps you'll appreciate the extreme nature of the risks in such a scenario, risks that lead to the utter and repeated failure of traditional backup-and-recover DR techniques. Even more advanced approaches such as distributed, cloud-based automated failover have their limits, and they increase the complexity and attack surface to boot. This is a target-rich environment, even if you are running low on ammo. And I haven't even mentioned supply chains yet. Well, apart from now.

^ All of that, and more besides, flows from a quick read of an inspiring article on LinkedIn and a single (count it with me: one. Stop.) lonely mention of disaster recovery in all 73 pages of NIS 2.

Oh boy oh boy.
