+44 (0)20 7877 0060 contact@2-sec.com

The Near Miss Mentality – No harm, no foul?

What is a Near Miss?

In a non-cyber security context:

Every single day countless accidents nearly happen, but often we don’t spare them a second thought. Because, simply put, they didn’t happen.

Some of those almost-accidents are minor things – one of my kids carelessly knocks into a glass of red wine in a pale-carpeted room. It teeters almost impossibly close to overbalancing, rocks back and forth a few times, then settles back down without spilling a drop. “Wow, that was close, please be more careful!” I might exclaim. The offending child probably hadn’t even noticed the trouble they were almost in. Five minutes later the near-event is forgotten by all.

Some are life-threatening – a driver approaching a pedestrian crossing too fast hits the brakes; his tyres momentarily lose their grip on an ageing road surface before regaining purchase just in time to bring the car safely to a halt. The driver’s heart is likely pounding like a Hofesh Shechter performance (look them up, you won’t regret it); the pedestrians, on the other hand, might be totally oblivious to just how close they came to meeting a premature demise. How long will that near miss stay in the driver’s mind? Will he alter his driving behaviour for life? For weeks? Days? At all?

Last week marked the anniversary of a big one

On 23 March 1989 an asteroid 1000 feet in diameter slipped silently past our planet, passing through the exact space that the Earth had occupied just 6 hours earlier. Had it hit us, the impact would have been immense; an explosion 12 times larger than the largest in recorded history. It would have killed millions, or more.

But the anniversary of us realising it had happened was not until today

It was 31 March 1989 before we became aware of just how close we had come to being hit by asteroid “1989 FC”. We never saw it coming.

Of course we learned from this

It soon became apparent that this kind of thing wasn’t even that uncommon, and we started to pay closer attention to the paths of such objects. In fact, at one point it was estimated that there was a 1 in 300 chance that asteroid “1950 DA”, with a 0.6-mile diameter (3 times that of “1989 FC”), would collide with Earth on 16 March 2880. The probability has since been reduced to 1 in 20,000, and we hope to reduce it further at our next opportunity to refine our knowledge of its orbit, in 2032.

We can always learn from a near miss

It is all too common to treat situations as either black or white, as having an outcome that can be defined as either a success or a failure. A near miss might nearly have had a negative outcome, but even if this was only avoided through sheer dumb luck, by this approach it would still be considered a success. Clearly there are plenty of lessons to be learned from near misses – lessons that could well save us from high-impact failures further down the line.

Back on topic then – What if the near miss was an attempted cyberattack?

One that almost resulted in a huge (and extremely costly) data breach? It failed (missed) and cost you nothing this time. But if you don’t intelligently learn from every failed attack, you can bet that the attacker will. And if your organisation did nothing the first time then there’s nothing to stop them trying again tomorrow, and the next day, and the next day, gaining knowledge of your operation with every attempt. Pretty soon it won’t be a near miss any more.

If a near miss is just a miss, then a near failure is just a failure

And they are just two ways to say the same thing.

A near miss is by no means an indicator that your defences are adequate and effective. A near miss is a warning that, whatever your previous estimation of risk, it’s time to start worrying about what number to multiply that by. There’s no reason to imagine you were attacked at random: your perimeter has just been tested, and if this was a targeted cyberattack, the attackers will be back, hitting harder and harder until they succeed – and you fail.

It is hard to heed a warning that you didn’t hear

It’s an accepted fact that even when it isn’t a near miss – when businesses have actually been breached – in many instances the breach is not even detected. Certainly many breaches go undetected for a significant period of time, and often when detection finally occurs it is not through IT systems but through the appearance of stolen data in the public realm. Of course, the fact that this situation is commonplace certainly does not make it OK!

A not-too-technical example: many targeted attacks on big business begin with a campaign, often lasting many months, of targeted phishing emails sent to employees known by the attackers to hold the privileged access levels they require. We are constantly warned about phishing attacks, but if we knew our organisation was currently being targeted, we might be that little bit more vigilant.
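To make the “knowing we’re being targeted” point a little more concrete, here is a minimal sketch of how employee-reported phishing emails might be pooled and counted to surface a coordinated campaign. The look-alike domain, report data, helper name and threshold are all invented for illustration – real phishing triage tooling is far more sophisticated than this.

```python
from collections import Counter
from datetime import date

# Hypothetical phishing reports: (date received, sender domain, targeted role).
# Illustrative data only, not from any real incident.
reports = [
    (date(2019, 1, 10), "payro11-portal.example", "finance"),
    (date(2019, 1, 24), "payro11-portal.example", "finance"),
    (date(2019, 2, 2),  "hr-benefits.example",    "hr"),
    (date(2019, 2, 15), "payro11-portal.example", "finance"),
    (date(2019, 3, 1),  "payro11-portal.example", "it-admin"),
]

def likely_campaigns(reports, threshold=3):
    """Flag sender domains seen in several separate reports -- a crude
    hint that the organisation is being deliberately targeted rather
    than receiving random background spam."""
    counts = Counter(domain for _, domain, _ in reports)
    return [domain for domain, n in counts.items() if n >= threshold]

print(likely_campaigns(reports))  # the repeated look-alike domain stands out
```

Individually, each reported email might be shrugged off; aggregated, the repeated look-alike payroll domain is exactly the kind of slow-burn campaign signal worth warning staff about.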

A little more technical: If you were to sit in a large organisation’s security operations centre and watch the array of outputs from the various monitoring systems, you’d find that there is a near constant white noise of ‘false positives’ being raised – potentially questionable transactions taking place, but mostly harmless and far too numerous to investigate. Actually, there’s something to be learned from every single one of these, even if it’s “now we know that this particular one really is normal!”.
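The idea of learning from that white noise – establishing what “normal” alert volume looks like so that the genuinely unusual stands out – can be sketched in a few lines. The alert counts, threshold and helper below are invented for illustration and are not any real SOC product’s logic:

```python
from statistics import mean, stdev

# Hypothetical hourly counts of one "noisy" alert type over the past week.
# Invented numbers for illustration only.
baseline = [41, 38, 44, 40, 39, 43, 42]

def is_anomalous(count, history, z=3.0):
    """Treat the usual false-positive noise as a statistical baseline
    and flag only counts that deviate sharply from it."""
    mu, sigma = mean(history), stdev(history)
    return abs(count - mu) > z * sigma

print(is_anomalous(42, baseline))   # within normal noise
print(is_anomalous(120, baseline))  # a sudden spike worth investigating
```

Even this toy version captures the point of the paragraph above: every “harmless” alert still teaches you something, because it sharpens the baseline against which the one dangerous deviation is finally visible.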

I once attended a talk in which we were taken step by step through a penetration test, authorised at board level, of a large global financial organisation. The head of the external team brought in to perform the test sat in that organisation’s security centre, with some invented backstory, just observing how events were handled. He innocently asked, “What are those warnings?” and was told not to worry, that was normal. He knew that his team had already gained access to a meeting room, used a guest account on the PC in the room to access unsecured network drives, and escalated their access privileges multiple times. By 10 am on the second day of the test they contacted him to say that there was nothing left to compromise – they had full control over every system within the organisation, including physical security, and could have exposed so much customer data that that FTSE 100 organisation would have collapsed overnight.

But don’t worry, those warnings are normal.

How can you possibly be expected to detect and respond to near misses, when breach detection remains such an issue?

Well, one certainly follows the other, and it’s time to start watching the paths of asteroids in cyberspace.