Computers need software. Software has bugs. There is no such thing as bug-free software. So how can more software – which means more bugs – improve security?
It’s scary, actually. Think about it: just about anything we buy, use and depend on these days has software. Lightbulbs, batteries, bikes, pacemakers, toothbrushes, keys, phones, TVs, door locks, trains, airplanes – even shoes and clothes are getting there; the list goes on and on. Small computers – many known as IoT devices – running specialized software all over the place. Bugs (and risks) included.
Mentioning this fact doesn’t even surprise most people. Bugs? OK, get an update and move on. That’s how accustomed we’ve become to the errors, vulnerabilities and things that worked yesterday but not today. Well, pull the plug, count to 10, try again. We don’t even call customer support anymore. Not because of the wait (it’s exasperating, see The Customer Service Disaster) but because we know what they’re going to say. Pull the plug, wait … etc.
The interesting thing isn’t the number of problems but our lenient, accepting attitude. Even very expensive, so-called high-quality products have problems, and we just shrug, pull the plug, try again, waste time and move on.
Admittedly, the situation has improved. Not necessarily the number of bugs or problems, but the ease of getting updates. They’re more often than not automatic – which has an interesting twist: Just as you had gotten used to a work-around, the problem is not only gone, it’s replaced by a new one.
I’m digressing – and I’m being sarcastic. You may be thinking ‘it cannot be that bad – after all, cars don’t stop and airplanes don’t fall down all the time’. Well, they do, just not all the time. The 737 MAX crashes and subsequent grounding a few years ago were a software problem. Admittedly not buggy code but a flawed algorithm, but still. There are small and medium problems with most of these products all the time. It’s a calculated risk – but there is one more thing, and this is my point: in many, hopefully most cases where life is at stake, there are multiple systems covering the same task.
Remember the Boeing 747? Of course – it was in the news just a few weeks ago because the ‘last of a breed’ was delivered. An incredible story that started in 1968, actually 1963. 1968 was the year the first aircraft rolled out, and here’s the thing: no microprocessors, no digital sensors, very little digital anything. Just about everything was mechanical, electrical, analog: shafts, levers, servos, switches, relays – and a lot of wiring. Stuff that moves, stuff that wears out, stuff that breaks. Still – extremely reliable. I suspect the early models were as reliable as the later ones with more electronics and digital components (although more expensive to build and maintain). Why? Because there were few if any single points of failure. Backup systems, multiple systems for critical tasks ready to ‘step in’ in case of failure – plus lots of maintenance. Not rocket science at all, more like common sense and clever engineering.
Back to the digital world and software, this is exactly why more software (with more bugs) can improve reliability and security. Duplicated (or triplicated) functions, functions that monitor each other, anomaly detection and much more. Smartness enabled by the very presence of (small) computers and software.
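To make the idea concrete, here is a minimal, purely illustrative sketch of triple redundancy with majority voting – the channel values are made up, and a real system would of course vote over live sensor or subsystem outputs:

```python
from collections import Counter

def vote(readings):
    """Return the majority value among redundant channels.

    If a strict majority agrees, the system tolerates a faulty
    channel; if no majority exists, we fail loudly instead of
    silently trusting a possibly bad value.
    """
    counts = Counter(readings)
    value, n = counts.most_common(1)[0]
    if n > len(readings) // 2:
        return value
    raise ValueError("no majority - redundant channels disagree")

# Three redundant channels; one has failed and reports garbage.
print(vote([42, 42, 97]))  # the two healthy channels outvote the bad one
```

The point is not the ten lines of code but the architecture: the voter itself is extra software (with its own potential bugs), yet the system as a whole becomes more reliable than any single channel.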
Now, if such robustness is possible, why do our software-based products fail or act up all the time? Why do we have all these bug-related problems – in everyday products and mission/life critical systems? Because vendors are sloppy, developers are rushing (or being rushed), duplicated functions are conveniently or intentionally ‘forgotten’, or – and this is particularly interesting – duplicated functions are identical, have the same bugs and fail concurrently.
An important difference between old mechanical systems and new (digital) variants: while a mechanical system – say, rudder control or flap/slat shafts – could be almost literally duplicated, software should not be, for the simple reason that duplication also duplicates the bugs. If one copy fails, the other(s) will likely fail in the same way, for the same reason(s), at the same time. You may argue that the same applies to physical mechanisms, but they fail differently and are less prone to cascading failures.
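The classic countermeasure is what the literature calls N-version programming: implement the same function independently, more than once, and cross-check the results at runtime. A toy sketch (the function names and the sorting task are my own, chosen only for illustration):

```python
def sorted_builtin(xs):
    """Version 1: rely on the language's built-in sort."""
    return sorted(xs)

def sorted_insertion(xs):
    """Version 2: an independently written insertion sort."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

def diverse_sort(xs):
    """Run both versions and cross-check before trusting the answer."""
    a, b = sorted_builtin(xs), sorted_insertion(xs)
    if a != b:
        raise RuntimeError("versions disagree - possible bug detected")
    return a

print(diverse_sort([3, 1, 2]))
```

Because the two versions were written differently, a bug in one is unlikely to be mirrored exactly in the other – which is precisely the property literal duplication cannot give you.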
For this very reason, smart companies never rely on equipment and software from one single vendor. Security and network equipment in particular should come from different vendors in order to avoid the threat of ‘monocultures’ – one single bug affecting thousands of units, stopping or exposing entire organizations, even countries.
I digress, but this is how small bugs, omissions, even misunderstandings in software cascade into small, sometimes big catastrophes. And this is why more software – if done right – can create extremely resilient, secure, reliable – in some cases even self-repairing – systems. Think Perseverance on Mars and the Ingenuity helicopter. Of course they have software bugs. Of course there is extreme reliability. It’s demonstrably possible. (See also Mars is Closer, Survival is not …)
Back to the key question: why are we – you and I – experiencing all these bugs, vulnerabilities, weaknesses, crashes etc. daily? Is it bad quality? You bet it is. Is it getting better? No, it’s getting worse, for the simple reason that demand for software outstrips production capacity. Too few (good) programmers, too many projects, sloppy quality control and a lenient market.
Still, there is some good news. First, the obvious: high quality is demonstrably possible but carries a high(er) price. Second, the situation will improve shortly. New ‘smart systems’ (also known as AI), based on technology similar to ChatGPT, will soon be churning out secure, reliable, high-quality software at high speed. Not 5 years from now, but starting this year. In fact it has already started – at small scale. It’s not going to be bug-free, but it’s going to be a huge improvement (take a look at this article on Medium.com for some inspiration).
It’s the best software news in decades. Will they deliver improved security? You bet they will. What will happen to the programmers? The good ones will be teaching the ‘machines’. The not so good ones will be doing something else.