Our digital world is software, created and built by software engineers. If the software stops, the world stops. That’s software power. Can those engineers take it down – or abuse it? Of course. It happens all the time … and it’s getting worse.
Major strikes remind us of the obvious: No actors and writers, no movies. No dock workers, no loading and unloading. No bus drivers … – and so on. Their importance in our lives varies, but none of them stops the world. Just like ‘no software engineers, no software’, right? Wrong. Software runs the world in ways never seen before. Stop key software products and the world stops. Almost like sucking the air out of the room – or the atmosphere. Everything stops. That’s power.
Of course it’s not that simple, but it’s an important line of thought. Our digital world is enabled by and totally dependent on software – and thus on software engineers and their ‘professional relatives’: data scientists and a few others – architects, developers, designers, testers, cleaners, etc. They literally keep the world running – and moving ahead. They are the enablers and custodians of AI/LLMs and the like. And they can stop it – just as literally. Maybe not one or two, but it doesn’t take more than a handful of collaborating specialists to stop a country – or even a continent.
That’s how exposed we are – and that’s the power they have. Of course there are mechanisms in place to ensure it cannot happen. They are good and constantly improving, but they’re not perfect. We’ve seen the Internet go down in parts of the world because of accidental misconfigurations. We’ve seen ‘terrorists’ lock down entire cities. And we’re seeing successful ransomware attacks all over the world almost daily – enabled by software weaknesses, bad design and sloppiness. By software specialists and software companies.
That’s exposure on the one hand and power on the other. Deep knowledge about how our digital world works at the lower levels, about the technologies and tools that keep our systems going, is an extremely powerful and valuable possession.
Having such power is also tempting. Small and untraceable changes here and there, deep down in a system, may deliver dramatic results higher up – and never be caught. Which means that the moral compass of individuals is more important than ever. It’s also under heavy attack – constantly.
Although diving down into risk, exposure, digital warfare and the ongoing work to make our digital world safer and more resilient would be intriguing, it’s already widely covered in the trade press and expert fora around the world. For example, if you’re interested in the current state of Internet resilience, check out the 70-page report Security of the Internet’s Routing Infrastructure, published by a group of experts last year. Or head into the fascinating book You Are Not Expected to Understand This – you don’t have to be a software engineer to enjoy it. A teaser from the back cover:
Few of us give much thought to computer code and how it comes to be. The very word “code” makes it sound immutable or even inevitable. “You Are Not Expected to Understand This” demonstrates that, far from being preordained, computer code is the result of very human decisions, ones we all live with when we use social media, take photos, drive our cars, and engage in a host of other activities.
Instead, I’d like to delve into a very different side of the same picture, brought up the other day on the Substack channel The Pragmatic Engineer: software power as an enabler of big-time fraud. All – and it’s worth repeating: all – cases we’ve seen of scams and fraud in the crypto arena (and FTX in particular, one of the examples in the article) have been enabled by software engineers. In some cases by creating false pretenses, in other cases by fabricating or manipulating data. Changes that are (relatively) easy to make for someone who knows the systems and the technology. And in most cases hard to discover – until someone very competent digs deep under the hood.
This is ‘software power’ – whether it’s about data, back doors, spyware, intentional malfunction, presentation or something else digital. It’s also – as alluded to in the title of the Substack article – an exposed position: Asked to do something illegal at work? Here’s what these software engineers did.
Interesting (and recommended) reading, and a very timely reminder of a whole new spectrum of exposures and vulnerabilities in the digital world. Professional vulnerabilities that are easy to exploit if something as old-fashioned as our moral compass doesn’t work.
Possessing software power is a great feeling, a great value and a safe future – if and only if we always remember this obvious but important statement from the article:
If you take one lesson from this, it’s that you can always say no.