Think About Digital Ethics Within Continually Evolving Boundaries

When new technologies lift constraints, the question moves from what you “can” do to what you “should” do.

The emerging discipline of digital ethics will probably always lack a set of universal rules, but the speed of technological advance and an uneasy public give business leaders no choice but to start defining their own positions in this area.

We’re close to the point of ‘algorithmic policing’, where machines protect us from ourselves with cars that won’t speed and phones that won’t work while driving. But do we want our employer or insurer to know everything about us? Or should we be allowed the privacy to eat ‘badly’ and, to some extent, browse the Web unmonitored?

During the Gartner Business Intelligence & Analytics Summit, I explained that technology can realize both the dreams and the nightmares of mankind: the same facial recognition that could thwart a terrorist attack, keep your home secure or keep your children safe could also be used to profile and discriminate in a police state.

Most people take the position that technology itself is not a moral agent. Yet relinquishing control to machines raises questions of accountability when amoral drones take the decision to kill, or an autonomous car crashes.

Gartner analyst Frank Buytendijk explains the importance of digital ethics during the Gartner Business Intelligence & Analytics Summit.

Draft rules around insurance for driverless cars are likely to state that whoever puts the car in motion – whether inside the vehicle or remotely – is ultimately responsible. How do you program a car to decide whether to swerve and kill a cat to avoid hitting a child? Is the programmer responsible for the loss of life, the car manufacturer that employed them, the car’s owner, or even the passengers?
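
To make the accountability question concrete, here is a purely hypothetical sketch of how such a trade-off might be encoded. It does not describe how any real vehicle works; the harm weights, the Obstacle type and the choose_maneuver function are all invented for illustration.

```python
# Hypothetical illustration only: a crude, cost-weighted collision-avoidance rule.
# The harm scores below are arbitrary; deciding them is exactly the moral
# judgment this article argues someone must be accountable for.

from dataclasses import dataclass

# Arbitrary "harm scores" -- who decides these values, and on what authority?
HARM_WEIGHTS = {
    "child": 1000,
    "adult": 900,
    "cat": 10,
    "property": 1,
}

@dataclass
class Obstacle:
    kind: str              # e.g. "child", "cat"
    in_current_path: bool  # True if the obstacle lies in the car's current path

def choose_maneuver(obstacles: list[Obstacle]) -> str:
    """Pick the action whose total harm score is lower."""
    stay_cost = sum(HARM_WEIGHTS[o.kind] for o in obstacles if o.in_current_path)
    swerve_cost = sum(HARM_WEIGHTS[o.kind] for o in obstacles if not o.in_current_path)
    return "stay" if stay_cost <= swerve_cost else "swerve"

# A child ahead and a cat in the escape path: the rule says "swerve".
print(choose_maneuver([Obstacle("child", True), Obstacle("cat", False)]))
```

The point of the sketch is that someone has to choose those numbers before the car ever leaves the factory, and whoever does so, programmer, manufacturer or regulator, has in effect made the decision long before any crash.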

Once technology is released, its consequences assume a life of their own. Once an algorithm successfully replaces a human task, it can rapidly replace that task worldwide, for better or worse. To return to facial recognition: racial profiling by one police officer is a problem, but as a global rule in a policing machine it’s terrifying.

Automated rules are repeatable, scalable and efficient, but the concern is that they can dramatically amplify mistakes. At the same time, public mistrust of how data is secured and used behind closed doors is a fundamental problem for most of the organizations involved.
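
To see why that amplification worries people, consider another deliberately simplistic, hypothetical sketch: a hard-coded threshold that would be one officer’s questionable judgment becomes the same questionable judgment applied to a million records the moment it ships. The flag_for_review function, the risk_score field and the 0.7 threshold are all invented for this example.

```python
# Hypothetical illustration of amplification: one flawed rule, applied everywhere.

def flag_for_review(record: dict) -> bool:
    # One globally deployed rule: if the threshold or the underlying score is
    # biased, every deployment repeats the same mistake at machine speed.
    return record["risk_score"] > 0.7

# A synthetic population of one million records.
population = [{"id": i, "risk_score": (i % 100) / 100} for i in range(1_000_000)]
flagged = [r for r in population if flag_for_review(r)]

# One line of logic has just made a million individual judgments.
print(f"{len(flagged):,} records flagged by a single rule")
```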

Organizations must establish digital ethics because people will judge them through a moral lens. Yet this is like wading in moral quicksand as unforeseen issues emerge and reactive regulation struggles to keep up. Society isn’t yet ready for the possible outcome that, at some point, smart machines themselves will become responsible for their actions – but it’s the task of digital ethics to start the discussion.

Gartner clients can read more in the report Digital Ethics, or How to Not Mess Up With Technology by Frank Buytendijk.
