What is responsible technology, anyway?
by Catherine Miller
Doteveryone’s founder Martha Lane Fox wrote a piece for the Guardian this week about making Britain a world leader in responsible technology. “In this brave new post-Brexit world,” she said, “let’s choose to be a country that believes technology in and of itself is not enough — that demands it be fair, ethical, and sustainable as well.”
It is a strong vision, and now we want to start making it a reality. The first step is building practical, usable guidelines for creating and deploying digital products and services ethically, guidelines that can help drive change in ways which work for everyone.
Ethical standards for products and services are not a new idea. As a consumer, I’m already offered an ‘ethical’ choice in so many areas of my life. When I drink coffee, I can choose to buy fair trade; I can do the same with everything from dolphin-friendly tuna to palm-oil-free biscuits to conflict-free diamonds. But in digital, so far there’s no handy, widely recognised stamp of approval that tells me the service or product I’m using has been created with ethical values at its heart. Why?
For starters, what do we mean by ethical? We can’t talk about ethics without talking about how complicated and interconnected these issues are. (This isn’t a problem unique to digital, of course: my dolphin-friendly tuna may have been caught by a fisherman paid miserable wages in unsafe conditions.)
So creating an ethical framework in 2017 is a massive challenge. Where do we want to begin changing our digital products and services? Should we focus on stamping out racial discrimination on Airbnb and Uber? Or do we want to make sure that advertising revenues will not fund extremism? Or that content is not made up? Or that people who deliver the products we buy online get paid a decent wage? Which lever do we pull to drive the biggest change? And what are the consequences of choosing one issue over another?
Even once we have a clear definition of ethics in technology, actually creating change amongst our tech companies will be a challenge. Our most common digital products act as monopolies, so the idea of exerting consumer power over them through choice doesn’t work. I can’t choose between an unethical and ethical version of Facebook or Google.
And finally, even with only a limited understanding of agriculture and fishing, as a consumer I can have a rough idea of how my coffee ended up in my cup or a tuna sandwich on my plate. But the same is not true of how an app arrives on the home screen of my smartphone, or where the information I give that app goes afterwards.
But just because it’s difficult doesn’t mean change isn’t possible.
A huge amount of good work is already going on to create tools that address the many different fields ‘fairtech’ could touch on, from securing personal data, to making terms and conditions more comprehensible, to building products with a diverse workforce. Consumers International and IF have recently published a really valuable list of initiatives under way.
We don’t want to duplicate the good work and thinking that has gone into these existing projects. But we do want to understand where they are most needed, and why they are not more widely known or implemented.
So we’re on the lookout for organisations, whether startups launching a new product or established companies updating their digital offerings, that are willing to put these guidelines into practice. We want to accompany the development of a digital product and demonstrate a ‘real world’ implementation of ethical values. Get in touch if you want to be part of this work, or if you know someone who might be up for it.
We’ve started a few conversations with organisations and trade bodies who share our values, like Co-operatives UK, about how we might do some practical experiments with them. We’ve also been mapping the many different attempts to define ‘ethical’ online, and we would love to hear about the ones we haven’t found yet.
We know it takes time to drive change. Successful campaigns like fair trade, the living wage, or charging for plastic bags took as much as a decade to develop from grassroots activism into established and regulated ways of doing things.
Digital products and services move quickly, at a speed many of us find dizzying. Certainly they move faster than our governments seem able or willing to regulate.
But the vacuum of regulation is precisely why an ethical framework is so vital for the digital world. A lack of governance should not mean that anything goes.