March 6, 2019

In a software-driven world, who is responsible for the risks?

By Jessica Lavery

The power of software to improve our lives and our world is almost limitless. Consequently, those creating software are wielding a power that demands a new level of responsibility.

When I think about how fast the world is changing, I wonder how our ancestors must have felt at the dawn of past industrial revolutions. Everything changed – the way we made, shipped, and sold goods evolved, and daily schedules and lives changed as people moved to cities to escape subsistence farming and find work in factories and mills. All of this change was fueled by new technologies and innovations. While many of these changes were positive, there were risks and costs, such as increased injuries, rising wealth inequality, and, as urbanization took hold, an increased spread of disease. It became the responsibility of factory workers, and in some cases the government, to address these concerns in order for our economy and society to flourish and grow.

We are at the dawn of the fourth industrial revolution, in which software not only powers our lives but is also being created by organizations to change the world in remarkable and sometimes unimaginable ways. We are already seeing software innovations that tackle some of modern society’s biggest challenges. There is software that helps farmers determine exactly how much water their fields need, so they don’t waste such a precious resource. There is software that helps diagnose disease, monitor vital health information, and even deliver treatment.

This software is not just powering our world; it’s changing our world. Thus, those who create software hold an increased level of power in our society – and as the Spider-Man comics say, “with great power comes great responsibility.”

Consider that the average car today contains approximately 100 million lines of code. A good portion of that code goes into the features that make the car more automated. The ethical implications of writing this code are far more complex than those of most software. For example, developers building a self-driving car must consider how it will respond if it is forced to choose between hitting another car or, worse, a pedestrian or cyclist. There is no right answer, and typically the driver would make that split-second decision based on instinct, reaction time, and cultural priorities. When a computer makes the decision, however, we are really asking the developer to decide in advance, shifting that responsibility onto whoever writes the code.
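To make that shift concrete, here is a deliberately simplified, purely hypothetical sketch – the outcome names and “harm weights” below are invented for illustration and are not drawn from any real vehicle system. Whatever form the real algorithm takes, someone has to write values like these down, and that someone is a developer.

```python
# Hypothetical illustration only -- not from any real vehicle system.
# The point: these numbers must be chosen by a person, long before any
# crash occurs, which is where the ethical responsibility actually lives.

# Developer-chosen "harm weights" for possible outcomes (illustrative values).
HARM_WEIGHTS = {
    "hit_car": 1.0,
    "hit_cyclist": 10.0,
    "hit_pedestrian": 10.0,
}

def choose_maneuver(available_outcomes):
    """Pick the outcome with the lowest developer-assigned harm weight."""
    return min(available_outcomes, key=lambda outcome: HARM_WEIGHTS[outcome])

if __name__ == "__main__":
    # In an unavoidable-collision scenario, the "decision" is really a
    # lookup into values a person chose at design time.
    print(choose_maneuver(["hit_car", "hit_cyclist"]))  # -> hit_car
```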

As software becomes more ingrained in our lives, we place an increasing responsibility on the shoulders of developers to make sure that software is functional and safe – safe both in how it operates and in how secure it is. In a world where software is used to treat patients and solve important human problems, a breach in the digital world can have a tragic effect on the physical world. What a great responsibility developers have to code securely. We are putting our trust in their typing hands – trust that they will create great code, code that does no harm and that doesn’t let bad actors use the software to do harm. I’m not sure that’s what most programmers signed up for when they decided to take that first computer science course. But it’s our current reality, and ultimately it’s the responsibility of everyone who interacts with software – whether purchasing, using, or coding it – to insist that quality software = secure software.
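That responsibility shows up in far more mundane code than self-driving cars. As a rough illustration – the patient_records table and lookup functions here are hypothetical, invented for this example – the difference between software that can be turned against its users and software that cannot often comes down to a single line:

```python
# Hypothetical example of everyday secure coding; the table and data are invented.
import sqlite3

def find_patient_unsafe(conn, name):
    # Vulnerable: building SQL by string concatenation lets an attacker
    # inject their own SQL through the 'name' value.
    return conn.execute(
        "SELECT * FROM patient_records WHERE name = '" + name + "'"
    ).fetchall()

def find_patient_safe(conn, name):
    # Safer: a parameterized query keeps user input as data, never as code.
    return conn.execute(
        "SELECT * FROM patient_records WHERE name = ?", (name,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE patient_records (name TEXT, diagnosis TEXT)")
    conn.execute("INSERT INTO patient_records VALUES ('Alice', 'healthy')")
    # A malicious 'name' such as "' OR '1'='1" dumps every row through the
    # unsafe version, but returns nothing through the safe one.
    print(find_patient_unsafe(conn, "' OR '1'='1"))
    print(find_patient_safe(conn, "' OR '1'='1"))
```

The first version lets an attacker smuggle SQL in through the name field and read every record; the second treats the input strictly as data. Quality software and secure software are the same habit, applied line by line.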

By Jessica Lavery

Jessica is part of the content team at Veracode. In this role, she strives to create and promote content that engages, educates, and inspires security professionals around the topic of application security. Jessica’s involvement with the security industry goes back more than a decade, to companies like Astaro and Sophos, where she held roles in corporate communications and marketing.