Tuesday 24 May 2016

Fascinating insight from a graph

Long-time/long-suffering readers of this blog will know that I am distinctly cynical, if not scathing, about published surveys and studies in the information security realm: most exhibit substantial biases, severe methodological flaws and statistical 'issues'. To be blunt, most are unscientific, worthless junk. Worse still, I am convinced that many are conscious, deliberate attempts to mislead us - essentially marketing collateral, fluff and nonsense designed and intended to coerce us into believing conjecture, rather than genuine attempts to gather and impart actual facts that we can interpret for ourselves.

Integrity is as rare as rocking-horse poo in this domain. 

Well, imagine my surprise today to come across a well-written report on an excellent, scientifically designed and performed study - viz "The accountability gap: cybersecurity & building a culture of responsibility", a study sponsored by Tanium Inc. and Nasdaq Inc. and conducted by a research team from Goldsmiths - an historic institution originally founded in the nineteenth century as the Technical and Recreative Institute for the Worshipful Company of Goldsmiths, one of the most powerful of London's City Livery Companies. The Institute's mission was 'the promotion of the individual skill, general knowledge, health and wellbeing of young men and women belonging to the industrial, working and poorer classes'.

"Goldsmiths" (as it is known) is now a college within the University of London, based in Lewisham, a thriving multicultural borough South East of the City, coincidentally not far from where I used to work and live. I think it's fair to equate 'tradition' with 'experience', a wealth of culture, knowledge and expertise that transcends the ages.

I'm not going to attempt to summarize or comment on the entire study here. Instead I restrict my commentary to a single graph, screen-grabbed from the report out of context, hopefully to catch your imagination as it did mine:

[Scatter graph from the report: 'awareness' plotted against 'readiness' for the surveyed organizations]
That scatter-graph clearly demonstrates the relationship between 'awareness' (meaning the level of cybersecurity awareness determined by the study of over 1,500 qualified respondents - mostly CISOs and non-exec directors plus other senior managers at sizeable UK, US, Japanese, German and Nordic organizations with at least 500 employees) and 'readiness' (essentially, their state of preparedness to repulse and deal with cybersecurity incidents). It is so clear, in fact, that statistics such as correlation are of little value.

In simple terms, organizations that are aware are ready and face medium to low risks (of cybersecurity incidents) whereas those that are neither aware nor ready are highly vulnerable.

Even a correlation as strong and convincing as that does not formally prove a cause-effect relationship between the factors, but it certainly supports the possibility of a mechanistic linkage. It doesn't indicate whether cybersecurity awareness leads or lags readiness, for instance, but let's just say that I have my suspicions. In reality, it doesn't particularly matter.
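
For anyone who wants to put a number on such a relationship anyway, Pearson's correlation coefficient is the usual starting point. Here's a minimal Python sketch using invented scores - emphatically not the study's data:

```python
# Toy illustration with invented data - NOT the Goldsmiths dataset.
import numpy as np

rng = np.random.default_rng(1)
awareness = rng.uniform(0, 100, 200)                  # fictional awareness scores
readiness = 0.8 * awareness + rng.normal(0, 10, 200)  # readiness tracks awareness

r = np.corrcoef(awareness, readiness)[0, 1]           # Pearson's r
print(f"r = {r:.2f}")                                 # close to +1 for this toy data

# Note the symmetry: r would be identical if readiness drove awareness instead,
# which is exactly why correlation alone cannot settle cause and effect.
```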

Please download, read and mull over the report. You might learn a thing or two about cybersecurity, and hopefully you'll see what I mean when I contrast the Goldsmiths study with the gutter-tripe we are normally spoon-fed by a large army of marketers, press releases, journalists and social networking sites.

Take a long hard look at the methodology, especially Appendix B within which is the following frank admission:
"Initial examination of the responses showed that three of the Awareness questions were unsatisfactory statistically. (The three related problems were that they did not make a satisfactory contribution to reliability as measured by Cronbach’s alpha; they did not correlate in the expected direction with the other answers; and in at least one case, there was evidence that it meant diferent things to diferent respondents.) With these three questions removed, the Awareness and Readiness questions showed satisfactory reliability (as measured by Cronbach’s alpha)." 
Cronbach's alpha is a statistical measure of internal consistency, using the correlations or covariances between answers across multiple questions to identify inconsistencies. The team used it to identify three questions whose results were inconsistent with the remainder, and on that basis excluded them - thereby potentially warping the entire study, since the report does not fully explain why nor how far those particular questions were out of line, beyond a vague comment about differences of interpretation in at least one case. In scientific terms, that exclusion was a crucial decision: without further information, it raises questions about the method, the data and hence the validity of the study. On the other hand, the study's authors 'fessed up, explaining the issue and in effect asking us to trust their judgement as the original researchers, immersed in the study and steeped in the traditions of Goldsmiths. The very fact that they openly disclosed the issue immediately sets them apart from most other studies that end up in the general media, as opposed to the peer-reviewed scientific journals where such honest disclosures are de rigueur.
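
For the curious, Cronbach's alpha compares the variance of each individual question with the variance of the summed scale: if the questions all tap the same underlying construct, the scale variance dominates and alpha approaches 1. A minimal Python sketch, with a score matrix invented purely for illustration:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x questions) score matrix."""
    k = scores.shape[1]                          # number of questions
    item_vars = scores.var(axis=0, ddof=1)       # per-question variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented 5-respondent, 4-question Likert-style data:
toy = np.array([[4, 5, 4, 5],
                [2, 2, 3, 2],
                [5, 4, 5, 4],
                [1, 2, 1, 2],
                [3, 3, 4, 3]])
print(f"alpha = {cronbach_alpha(toy):.2f}")      # ~0.95: the questions hang together
```

A question that drags the alpha down, or correlates the wrong way with its siblings, is exactly the kind the Goldsmiths team dropped.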

I'd particularly like to congratulate Drs Chris Brauer, Jennifer Barth, Yael Gerson and team at Goldsmiths Institute of Management Studies, not just for that insightful graph but for a remarkable and yet modest, understated contribution to the field. Long may your rocking horses continue defecating  :-)

Opportunistic security awareness

I just came across a little mobile-phone-shaped security awareness reminder card with a fairly direct, in-yer-face message along the lines of "You left your phone unattended. I could have stolen it. Instead I simply accessed all the sensitive information on it because it was unlocked and unencrypted ..." The idea, I presume, is for someone (an information security/awareness professional, security guard etc.) to wander around looking for mobile phones left unattended/unguarded in public places, such as cafes or on public transport, leaving the card with the phone.

Personally, while it is tempting to go down that line, I'd be very wary of adopting such sinister overtones in an awareness message. The shock value might work for some recipients but it's basically a threat, a form of coercion with distinctly negative connotations about phone thieves AND careless phone owners, not to mention the security people who printed and left the card. Such negativism and nastiness could cause collateral damage to the security awareness program.

However, a bit of lateral thinking can easily turn the idea into something far more positive.

Get phone security reminder cards printed to a size and shape that people can slip into their mobile phone cases or wallets, and hand them out at awareness events, staff meetings, management meetings and so forth. Glossy labels would work for those who prefer their phones naked. It's not hard to come up with suitably succinct, motivational messages: 'Protect personal information!  Use a PIN code, use encryption and remote wipe, don't trust caller ID, be discreet when using the phone in public, keep the security software updated, take backups of anything important, contact 555-SECURITY in case of loss or other concerns' ... that sort of thing. Our customers receive messages like those each month in the 'awareness activities' paper provided in the module, along with other creative security awareness suggestions. Awareness is what we do, after all.

If you have a budget for information security awareness*, you could even get mobile phone cases pre-printed with the awareness messages, preferably discreetly on the inside. Good quality phone cases would be sought-after corporate knick-knacks, making them attractive prizes for awareness challenges, events, quizzes and competitions. Reward those people who show an interest in information security, and make an effort to be secure. Be nice to them, support and encourage them. Send out positive vibes about the value of security.

If it works (which implies gathering and analyzing awareness metrics, by the way), it's not hard to extend the approach to pre-printed messages on corporate laptop cases, USB sticks, briefcases/portfolios, wallets, mouse mats, mobile mice, paper pads, sticky note stacks and so on. Just cast your eye over the average desk or cubicle to spot a plethora of places for those motivational messages. Don't forget the screens too: most IT systems can be configured with custom screensavers, pre- and post-login messages, error and warning messages, app screens etc., making them excellent choices for awareness purposes. Work with marketing to brand them with the corporate logo if you like, plus health and safety, HR, compliance, risk and other functions who share our interest in awareness ... but please, please make the awareness content creative and positive.
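
By way of example, on Windows the pre-login legal notice can carry the awareness message. A minimal sketch, assuming Windows and administrator rights - the caption and wording here are just samples:

```python
# Minimal sketch: put an awareness message in the Windows pre-login notice.
# Windows only; must run with administrator rights. Wording is a sample only.
import winreg

CAPTION = "Security is everyone's business"
MESSAGE = ("Protect personal information: lock your screen when you step away, "
           "question unexpected requests, and report anything odd to Security.")

KEY_PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "LegalNoticeCaption", 0, winreg.REG_SZ, CAPTION)
    winreg.SetValueEx(key, "LegalNoticeText", 0, winreg.REG_SZ, MESSAGE)
```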

Think 'inspiration' and 'motivation' not 'coercion'.


* PS  If you don't have the budget, I guess you have bigger fish to fry than unattended phones!  I'll expand on that depressingly common situation another time but for now take a look at security awareness on a shoestring budget.

Friday 13 May 2016

Friday poser

Is this a spoof PayPal email (most likely a phisher) or a genuine but misguided and inept attempt by PayPal to contact me? 

The message includes a login link, pointing not to PayPal.com but to PayPal-Communication.com, which could easily be a lookalike domain registered for the express purpose of fleecing naive recipients of their PayPal credentials, not to mention defrauding them of their funds.
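
A wary recipient - or a mail filter - can test that mechanically. Here's a minimal Python sketch; the function name and example URLs are mine, purely for illustration:

```python
from urllib.parse import urlparse

def points_at(url: str, legit_domain: str = "paypal.com") -> bool:
    """Crude check: is the link's host the legitimate domain or a true subdomain?"""
    host = (urlparse(url).hostname or "").lower()
    return host == legit_domain or host.endswith("." + legit_domain)

print(points_at("https://www.paypal.com/signin"))           # True
print(points_at("https://paypal-communication.com/login"))  # False: lookalike!
```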

Further down, the message suggests that addressing me by my "given surname and given name" means I should trust the message - codswallop! For a start, the correct term is "given and family names", but of course that is readily available information: phishers can easily obtain lists of email addresses complete with given and family names, and lots more personal information. Google knows billions of them. As a means of authentication, it is worthless. 

So, what do you think: genuine (but inept) or fake?

Thursday 12 May 2016

Treating computer room fire risks

Following the Montreal Protocol agreement in the 1980s, manufacture of the chlorofluorocarbon (CFC) and related halon extinguishant and refrigerant gases, commonly known under the trade names "Halon" and "Freon", ceased in developed countries before 1994.

Although these gases are highly effective extinguishants that work at relatively low concentrations (around 8%) with low toxicity and no nasty residues, they are potent greenhouse gases and, crucially, ozone destroyers. In the upper atmosphere they are slowly broken down by sunlight (over decades), releasing reactive free chlorine which destroys ozone, subjecting the earth's surface to increasing amounts of harmful UV radiation.
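
The ozone chemistry is catalytic, which is why even small releases matter: each liberated chlorine atom is regenerated and can go on to destroy many thousands of ozone molecules. In essence:

    Cl  + O3  →  ClO + O2
    ClO + O   →  Cl  + O2
    ---------------------
    net:  O3 + O  →  2 O2   (the chlorine atom survives to repeat the cycle)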


For a while, Halon made before 1994 was recycled from old stocks for specific applications by organizations that could afford it. Everything possible should be done to avoid releasing the gas unnecessarily - especially through accidental triggering or inept servicing, installation or testing of suppression equipment.

Thursday 5 May 2016

Auditing IT facilities


I learnt how to audit IT facilities by doing it, initially under the wing of an experienced consultant, then with various professional colleagues, but mostly under my own steam. I've done hundreds over the years, on many sizes of facility (ranging from SOHO and small office setups up to co-location and single-company data centres the size of football pitches) and many types of organization across a variety of industries.

The information risks and hence information security control requirements can vary markedly between facilities. A bank is in a rather different situation to, say, a power station, engineering company or hospital. Separate data centres within a given organization also vary e.g. HQ versus a branch or call centre. Data centres in different countries, cities/towns, even suburbs can vary e.g. compare Kinshasa to Kansas, Mogadishu to Manchester, Auckland North Shore to South Auckland. And they all vary over time, for example Brussels and Paris are different places today in terms of the terrorist threat than just a few years ago, and a data centre that was built last year is different to one put up when Turing was a lad. Despite all that, my approach is fundamentally the same, and I'm interested in similar types of concern.

Physically inspecting the facilities (usually with camera in hand) and talking to relevant people about what I see is the audit/review mechanism that works for me. I’ve yet to see a facility where a number of possible issues and concerns didn’t jump out at me just from a walkabout and chat. I simply keep my eyes and ears open, from the moment the facility comes into view, and ask plenty of dumb questions (“When was the last time you had a fire evacuation?  Was it a real fire or a practice? Who was in charge? How did it go? What did you learn? What has changed as a result? What else are you planning to do? …”).

The aspects I normally check include the following (see the sketch after this list for one way of turning it into a working checklist):
  • Environment/situation: any gross hazards such as earthquakes or volcanoes, tornados, flood plains & rivers nearby, slips and caves etc., neighboring facilities/areas, alternative facilities;
  • Architecture: physical construction, age & state of repair, visibility/conspicuousness, internal layout (e.g. separation from public, reception and delivery areas);
  • Physical access controls: policies & procedures (e.g. visitor authentication, tagging and baby-sitting, tailgating), locks (especially card access systems and skeleton keys), barriers, public and shared spaces, CCTV system (including camera quality, coverage, monitoring, recording and networking), guards (e.g. guard tours, contractors), PIR & other intruder alarms, turnstiles, layers with sterile areas/DMZs/moats, emergency egress, delivery areas, under-floor and over-ceiling voids, signage, rack security (including door locks and alarms);
  • Electrical power: incoming mains, switchgear & distribution, metering and alarms (local and remote), UPS, generators (and fuel!), capacity/sizing, redundancy, age, maintenance (including proper battery and generator testing), Emergency Power Off buttons;
  • Fire prevention: flammable materials and fire hazards nearby, smoke and heat detectors (including types and locations), alarms (local and remote), procedures, suppression systems and hand-held extinguishers, fire training, intumescent strips, maintenance & testing, suitability (e.g. high sensitivity aspirating systems), power and alarm interlocks, fire service or insurance company certification;
  • Air conditioning: type/suitability, capacity, redundancy, age, maintenance & cleaning, temperature monitoring & alarms (local and remote), response procedures;
  • Flood prevention: any signs of leaks, pipes in/near the facility, water detection & alarms, response procedures, emergency supplies of sheeting & buckets etc.;
  • Systems management: a whole detailed area of review goes on here concerning the way the IT, network, phone, security, building and other systems and services are managed and secured - I might go into that another time but I'll skip it for now;
  • Cabling: power and network cables and services – routes (e.g. ducts, conduit, separation), main and fallback routes (e.g. redundant ISPs, microwave plus fibre), protection, labeling & documentation, tidiness;
  • Other stuff: health and safety plus welfare (e.g. comfortable working conditions), change control procedures, documentation, labelling and control of servers, racks, switches etc., testing and exercising the procedures, any incidents, improvement initiatives and opportunities.
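
Purely by way of illustration (this sketch is mine, not part of any formal method), that list translates naturally into a structured template for recording walkabout findings - a few lines of Python with hypothetical ratings and notes:

```python
# Minimal sketch: record walkabout findings per audit aspect.
# The aspect names come from the checklist above; the findings are hypothetical.
from dataclasses import dataclass, field

ASPECTS = [
    "Environment/situation", "Architecture", "Physical access controls",
    "Electrical power", "Fire prevention", "Air conditioning",
    "Flood prevention", "Systems management", "Cabling", "Other",
]

@dataclass
class Finding:
    aspect: str                                      # one of ASPECTS
    rating: str                                      # e.g. "red", "amber", "green"
    notes: str = ""
    photos: list[str] = field(default_factory=list)  # walkabout snapshots

findings = [
    Finding("Fire prevention", "red",
            "Paint, cardboard and plastic stored in the computer suite"),
    Finding("Electrical power", "amber",
            "Generator fuel contract unclear; battery load test overdue"),
]

for f in findings:
    print(f"[{f.rating.upper():5}] {f.aspect}: {f.notes}")
```
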
'Professionalism' or 'quality' is a thread throughout every review. I'm looking for clear evidence that each aspect of the facility was properly considered, specified, designed, installed/built, used, managed, monitored and maintained, is fit for purpose (which takes into account the type and size of facility and the broader business situation), and is generally 'impressive'. Based on my experience and the answers to those dumb questions, it is usually obvious whether the people I meet have a clue or not, and whether they appreciate their own limitations. It's usually obvious within the first few minutes whether the facility is approaching world-class or falling well short.

It’s always good, by the way, to find excellent controls and people on hand – not least because I learn new stuff and hone my skills by doing these reviews, talking to the experts and reading up on relevant standards, building codes, local regulations, manufacturers’ advisories etc., oh and by constantly watching the news for stories about actual physical security incidents of all kinds. A stock of genuine anecdotes goes a long way towards persuading people that the risks are real.

I am a generalist, not an expert/specialist in each of those areas above. I believe I know enough and have sufficient experience to spot some issues, but I don’t consider myself competent to fully diagnose or fix them, and quite likely there are deeper issues/concerns that I don’t even notice. Where I find issues that concern me, my stock recommendation on anything complex is to take professional advice from competent experts in the relevant area as to whether the risks are truly significant and how best to address them. I generally stop short of giving specific advice, unless it is both obvious and simple - something along the lines of "Store all that paint, cardboard and plastic somewhere else, almost anywhere else, just not in the computer suite!"


PS  I tried but failed to persuade SC27 to beef up ISO/IEC 27002's coverage of physical security. I still believe the controls listed above constitute 'baseline security controls' - just common sense really, but evidently not as common as I thought.