Better than ethics

Rachel Coldicutt

After four years of eating, sleeping and dreaming responsible technology, Rachel Coldicutt gives her final talk as CEO of Doteveryone at 30 Chapel Street, the hub for social innovation in Bradford.

At the event on 27 November 2019, Rachel explains the importance of getting past the notion of “do no harm” and moving on to something better.

Today is my last-but-one day as CEO of Doteveryone, so you find me in a reflective mood. This is a less practical, and slightly more meandering, talk than I would usually give.

Sympathetic technology

I’m going to talk about how the proliferation of ethical codes for technology may actually be an exercise in defining the floor – what can be got away with, not what should be done – and how instead it’s vital to raise our sights and aim higher. To rebalance the impact of technology on people and the planet, it’s important to get past the notion of “do no harm” and move on to something better: something more sympathetic, longer-term, that puts the notion of digital disruption to bed.

In a climate crisis, humanity is not enough

Although this event might be about humane tech, the compelling vision we need to get to next cannot be all about people. The planet is burning. As humans, we need to get over ourselves a little; there’s no point in putting ourselves first if we have nowhere to exist. So rather than humane technology, I’m going to talk about what sufficient and sympathetic technology might look like. 

For the last four years, I have been eating, sleeping and dreaming responsible technology – trying to find ways for technology to be better for more people, more of the time. Our work at Doteveryone has focussed on structural changes: on the system that is needed to make that work, rather than identifying the values that underpin that system. 

Technology is not ungovernable

And the reality is that this system is not particularly complex – it might feel unimaginable amid all the noise about technology, and when the most dominant voices say that innovation is inevitable, technology ungovernable – but it’s recognisable from the (just about) functioning democratic society we live in now. 

In truth, the digital society is not that different from normal society.

There is certainly a place for business in that system, but there should also be space for a resilient civil society, for vital services delivered by the state, and for democratic debate. Currently, a few businesses dominate not only the delivery and infrastructure of our digital lives; their ways of making technology, their measures of success and their credos of disruption have also become the norm for how digital and emerging technologies are talked about, created and commissioned outside of business.

Shareholder value is not the only value

At the heart of any new vision for technology there needs to be an alternative to shareholder value and engagement metrics and a recognition that other kinds of value are important too. 

And something I’ve realised recently is that it doesn’t matter how good the system is, how useful the methods are, if it’s not driven by a compelling vision. A clear vision gives meaning – it allows some things to drop away and some others to become more important. It allows people to make choices. It sets an expectation of what good looks like that doesn’t rely on technical understanding.

Technology ethics has become a thriving field in the last few years. There is no shortage of codes of practice, guidelines, how-tos, and principles. We’ve been crowdsourcing a list at Doteveryone for a couple of years, and it’s now 27 pages long. 

None of these codes has really caught on or changed the world. A lot of them either sweat the details – setting out how-tos – or try to reinvent the wheel. But the reality is that, outside of the technology industry, there is broad consensus on what is good for people: the UN's Universal Declaration of Human Rights and the Sustainable Development Goals together set pretty compelling goals for humanity.

Are ethical codes the bottom of the barrel? 

But I also have a suspicion that, particularly from a corporate perspective, a lot of attempts at setting ethical codes are actually a way of defining the bottom of the barrel – an agreement on the lowest it is credibly possible to go.

Pushing the limits of innovation in this way feels survivalist: it picks winners and losers from the outset, and describes a world in which technical possibility is more important than the rights of any individual. It’s a world in which facial recognition technology is being rolled out in public spaces, despite it being ten times less accurate at recognising women of colour than it is at recognising white men. A world in which cities would have to be reconfigured and homes knocked down to accommodate self-driving cars. In which the carbon footprint of the technology industry will have doubled between 2007 and 2020, our attention is being manipulated for advertising revenue, democracy is being undermined by Facebook quizzes, and Netflix wants to compete with your sleeping pattern and your personal relationships. I could go on listing the problems, but that’s the easy bit. It’s more difficult to think of answers. 

It pays to be confusing

One reason it’s so difficult to think of answers is because the digital world seems so all-consuming, so big, so insurmountable. And this is partly by design. In a system that is entirely driven by the creation of capital, it pays to confuse; it pays to make people feel overwhelmed, so they turn ever more to their phones for the answer.

We spent a lot of time at Doteveryone trying to quantify digital understanding, and I realise that some of that work might have slightly missed the point: in 2019, it is simply not possible to understand the entirety of the digital world. Pushing the duty to understand back onto any individual is, in itself, an act of confusion and disempowerment: sometimes it feels like the more detail I know, the less I understand. 

The GDPR doesn’t give us any extra hours in the day

Or, as Jeff Bezos might put it, there are too many complexifiers – intentional complexifiers – in our way. For instance, the GDPR might give us a right to have automated decisions explained to us, but it doesn’t give us the extra hours in the day and cognitive capacity necessary to make that possible.  

Take news, for instance. For the last few years, living in the UK, there has – I would say – been too much news. It is possible to spend an hour furiously scrolling through Twitter while a parliamentary debate is happening, and somehow know less about it at the end than you did at the beginning. This sheer abundance of information makes reality quite hard to comprehend; while disinformation is certainly a problem, so is the overwhelming quantity of information we're presented with. How can anyone choose what to pay attention to, what is important? How do we anchor ourselves?

This reminds me of Donna Haraway’s extraordinary definition of our life on earth as “the thick now, the present which is not instantaneous, but extends into many kinds of time” (also, if you have a spare two hours, watch this.) There is an illusion that, as humans, we have some kind of supremacy – that we can know everything, be at the forefront of all realities – but in reality, the natural world is changing and multiplying and continuing all around us, all of the time. And the same can be said of data and the digital world, which is too big, too numerous, for most of us to comprehend. After all, life is busy. There is enough to think about.

We’re burning the planet and drowning in our own data

Looking for meaning in this thickness is difficult. It is too deep, too wide, it slips away too quickly. It is not something that most of us can conquer, and it’s not in the interest of capitalism for us to do that. And meanwhile, the complexification continues – the amount of data collected is going up and up and up, but very little of it is understood. Just as we’re burning the planet, we’re at risk of drowning ourselves in data – making new problems faster than we have time to solve them.

So just as we need to find new ways to live on planet earth, we need to look for different ways to co-exist with technology. We need to settle in, find out what a good life feels like, and feel empowered to claim that. We’re just over a decade into living with smartphones; it’s a tiny blip in time, there’s nothing inevitable or immutable about it. It’s our current reality, not our destiny.

Rather than assuming there is only a binary choice between technodeterminism and Luddism, between survivalism and apathy, we have the opportunity to create a moderate path – something more sympathetic and more balanced, in which honouring others' human rights, taking care of the planet and fulfilling the Sustainable Development Goals come as standard; in which data supports us and doesn't overwhelm us; and in which business doesn't always take the lead.

This is a world in which our digital lives aren’t lived entirely on autoplay – where the products and services we use collect just enough data to work well; where we can expect a future of just enough automation, not a world in which all work is taken away; where we buy and reuse just enough hardware; a world in which we can spend just enough time on our phones – where technology lifts us up, rather than oppresses us.

Knowing what good might look like is liberating. It helps us all to make better choices. And this is where the system comes back in. We cannot wait for better to happen. If you have some power – as a citizen, consumer, friend, family member, or professionally as a maker, designer or creator of any sort – then from time to time you get to make a choice. And, if you can, I'd encourage you to choose Just enough internet.

[Image: Just enough internet sticker]