Executive Summary

The public is once again recalibrating its relationship with technology. The pandemic lockdown has accelerated even further the already dizzying speed of technological change: suddenly the office has become Zoom, the classroom Google and the theatre YouTube.

The transformations wrought in this period will be lasting. The outcome of this period of increased tech dependence must be one where technology serves people, communities and planet.

Doteveryone fights for better tech, for everyone. To achieve this it’s vital to listen to – and respect – the views of the public. This report puts the people who are experiencing this tremendous transformation front and centre.

Based on our groundbreaking 2018 research, we ran a nationally representative survey just before lockdown and focus groups shortly after it began, benchmarking the public’s appetite for, understanding of, and tolerance of the impacts of tech on their lives.

This year’s research finds people continue to feel the internet is better for them as individuals than for society as a whole. But the benefits are not evenly shared: the rich are more positive about tech than the poor, risking the creation of a new class of the ‘tech left-behind’. And it finds most people think the industry is under-regulated. They look to government and independent regulators to shape the impacts of technology on people and society.

It finds that although people’s digital understanding has grown, that’s not helping them to shape their online experiences in line with their own wishes. They still struggle to get information about the issues that matter and to choose services that match their preferences.

And it finds people often don’t know where to turn when things go wrong. Even if they do report problems, they often don’t get any answers. They mistrust tech companies’ motives, feel powerless to influence what they do, and are resigned to services where harmful experiences are perceived to be part of the everyday. 

The current societal shift is an opportunity to shape a fairer future where technology works for more people, more of the time. Our practical recommendations to government and industry provide clear steps to make that happen.

Key findings

The vast majority of people think the internet has improved their lives but are less convinced it’s been good for society as a whole. 81% say the internet has made life a lot or a little better for ‘people like me’ while 58% say it has had a very positive or fairly positive impact on society overall. Only half feel optimistic about how technology will affect their lives (53%) and society (50%) in the future.

58% of the public say that the tech sector is regulated too little. They identify government (53%) and independent regulators (48%) as having most responsibility for directing the impacts of technology on people and society.

People are taking a range of measures online that stem from their digital understanding. Most people have checked their privacy settings (73%), looked for news outside their filter bubble (67%) or used an ad blocker (56%) but people tend to take these actions only occasionally. 

Nearly half (47%) feel they have no choice but to sign up to services despite concerns and 45% feel there’s no point reading terms and conditions because companies will do what they want anyway.

Over a quarter of the public (26%) say they’ve reported experiencing a problem online but that nothing happened as a result. More than half would like more places to seek help (55%) and a more straightforward procedure for reporting tech companies (53%).

Only 19% believe tech companies are designing their products and services with their users’ best interests in mind. Half (50%) believe it’s ‘part and parcel’ of being online that people will try to cheat or harm them in some way.

Recommendations

We recommend the creation of an independent body, the Office for Responsible Technology, to lead a concerted, coordinated and urgent effort to create a regulatory landscape fit for the digital age and ensure the benefits of technology are evenly shared in a post-pandemic world.

We recommend all tech companies implement trustworthy, transparent design patterns that show how services work and give people meaningful control over how they operate. The Competition and Markets Authority, in coordination with the Information Commissioner’s Office, should set and enforce best practice for understandability, transparency and meaningful choice for the platforms where people spend most of their time online.

We recommend the Government should base its forthcoming media literacy strategy around new models of public empowerment for the digital age that:

  • Meet people where they are, with opportunities to act embedded into products and services
  • Provide information that’s specific to the issue and tailored to the individual’s capability and mindset
  • Enhance rather than detract from current online experiences and create feedback about the impact of any action, creating the motivation to act.

We recommend all tech companies create accessible and straightforward ways for people to report concerns and provide clear information about the actions they take as a result. And we recommend the incoming online harms regulator provide robust oversight of companies’ complaints processes founded on seven principles of better redress in the digital age:

  1. Design that’s as good as the rest of the service
  2. Signposting at the point of use
  3. Simple, short, straightforward processes
  4. Feedback at every step
  5. Navigating complexity
  6. Auditability and openness
  7. Proportionality

We recommend that digitally-capable super complainants should be empowered to demand collective redress from technology-driven harms on the public’s behalf and to channel unresolved disputes between individuals and companies. And we call on the Government to support coordination for civil society organisations helping people to address the impacts of technology-driven harms on their lives.

Unlocking the benefits of technology for everyone

The UK public sees greater benefit from digital technologies for themselves as individuals than they do for society as a whole. As they look to the future, people become notably less optimistic, and most think the tech industry is under-regulated.


Enjoying the benefits of tech

The vast majority of people think the internet has improved their lives: 81% say the internet has made life a lot or a little better for ‘people like me’. But they are less convinced it’s been good for society as a whole: 58% say it has had a very positive or fairly positive impact on society overall.

Impact of the internet on individuals and society

In discussions held shortly after the start of the pandemic lockdown, people were particularly grateful for their ability to continue to work, maintain friendships and access information thanks to technology.

However, there’s a significant drop in the strength of people’s enthusiasm over the past two years with 38% saying the internet has made life a lot better for people like them, compared to 50% in 2018. 

And in conversation people expressed ambivalence about the trade-offs that technologies entail for them in their lives.

It makes people ignorant, including myself. People are too busy with their heads buried in their phones. I’m just as guilty as anybody. Get up in the morning, check Facebook, check Facebook throughout the day. It’s addictive and people can’t leave it alone.

Attitudes towards the impact on society are largely unchanged from the previous research with only a slight increase in the number of people who say it’s had a very positive impact and slight decrease in those who say it’s had a negative impact.

Some groups feel greater benefits from technology than others, with the over-65s most likely to say the internet’s been good for them and for society overall.

People on higher incomes are significantly more likely to say the internet has made life better for them (85%) than those who are less well off (75%). They are also more likely to agree that the internet has had a positive impact on society overall (62%) compared with just over half of those on lower incomes (52%).

Frequent users are also significantly more positive: 87% believe the internet has made life a lot or a little better for them, compared with 78% of light users, and 67% of frequent users see a positive impact on society, compared with 55% of light users.

This gap in perceived benefits between rich and poor, and between the digitally confident and those less adept with tech, is significant and has widened since 2018. Technology risks exacerbating existing inequalities and creating a new group of the ‘tech left-behind’. This is especially problematic at a time of increased dependence on the internet due to lockdown, and it heightens the vulnerability of those with least resilience to withstand the crisis.

Future caution

Looking towards the future, people’s views are more tempered. Although four in five (81%) say the internet has made life better for them, only just over half (53%) are optimistic about how technology will affect their lives in the future. Women are less optimistic (48%) than men (58%), and over-45s are less optimistic (48%) than under-45s (59%).

Again, there was greater optimism about the future among frequent users of technology (75%) and the wealthier (57%) than among less frequent users (45%) and the less well off (48%).

Likewise, while 58% saw a positive impact from the internet on society overall in the past five years, only 50% are optimistic about technology’s future impact on society. Again men (56%) and those under 45 (55%) are significantly more likely to be optimistic. And here too the wealthy (54%) and tech savvy (74%) were more positive than others.

These findings suggest that a significant portion of the public does not see innovation as a good thing in and of itself. Tech for tech’s sake is unlikely to wash. The public will need to be persuaded that technological change is in their and society’s interests.

There’s a personal thing where, on a day to day basis, these things are so useful – the speed at which we can order things, we can talk to people, we don’t have to leave the house. Brilliant! But I think there is a big picture in what is it doing to society and where is it going to take us? Because ultimately, if we have machines that do everything, we don’t even need to get out of bed in the morning, we don’t have a purpose anymore.

Getting the balance right

It’s challenging but necessary to make sure the benefits of technologies are fairly distributed. This requires checks and balances on how technologies are made and used so that they work for more people, more of the time.

But most people think that’s not happening: 58% of the public say that the tech sector is regulated too little, with 23% saying it’s regulated about the right amount and only 2% that it’s regulated too much.

If there is regulation we don’t know about it. It would be good to have simple clear overarching national regulation for all of it. We need a regulatory system or a department to lay down the ABCs of do’s and don’ts.

In any other industry you’d want a governmental body to take control. The problem is, the way that technology seems to be going is that one or two or three companies seem to be completely dominant and so it’s a weird scenario in which these companies almost appear quite trustworthy, just because of the size of them. Most people are happy for Google and Apple to take control and look after it, but I think that the government should continue to play an important role.

And most people are prepared to accept the potential limitations this might entail.

When asked to choose between two sets of trade-offs, a significant majority opted for greater restrictions on content and consumer choice in exchange for regulation.

Accepting the trade-offs that come with regulation

When asked who should take the most responsibility for the impact technology has on people and society, around half point to government (53%) and independent regulators (48%).

In practice though, they see regulators (43%) and tech company leaders themselves (41%) as most able to influence the effects of technology with only a third (35%) seeing government as one of the groups with most ability to influence outcomes.

And people acknowledge the challenge of regulating tech companies whose scale and pace vastly outweigh government’s capabilities.

Technology is getting more complicated as the years go by. I think a special department should be set up that can try and police it with experts that understand it. It’s a massive undertaking, how it will be done, I have no idea.

It should be taken out of the hands, as much as possible, from the companies that are actually making the technology, so they have to abide by the same sort of rules, and the same sort of sets of law. I guess with technology though, it’s a hard one because these companies are global so what government is responsible? Does the UK take its own approach?

To create an equitable digital society, technology must work for the benefit of individuals, communities and society as a whole. It’s the job of the UK’s democratic institutions to manage that through regulation. This research shows the public doesn’t feel the government is currently doing enough and there’s a clear demand for independent oversight. 

Since the publication of our last survey, regulatory initiatives have mushroomed – including proposed Online Harms legislation, the Cairncross and Furman reviews, the online platforms and digital advertising market study by the Competition and Markets Authority and the publication of the first recommendations from the Centre for Data Ethics and Innovation around online targeting. But so far there’s been little tangible change in practice.

The coronavirus response has understandably put much of the policy agenda on hold. But the crisis has accelerated the adoption of digital technologies in people’s personal lives and across the public and private sector and will create long term change in how society functions. It’s vital that this does not take place in a regulatory vacuum.

Our earlier report, Regulating for Responsible Technology, called for a systemic approach to accountability that will promote a fair, inclusive and thriving democratic society. This is more necessary now than ever.

We recommend the creation of an independent body, the Office for Responsible Technology, to lead a concerted, coordinated and urgent effort to create a regulatory landscape fit for the digital age and ensure the benefits of technology are evenly shared in a post-pandemic world.

As we set out in that report, this body will empower regulators by closing gaps in regulation and supporting them with expertise and foresight; inform the public and policymakers with an evidence base about the benefits and drawbacks of technologies; and support the public to find redress from technology-driven harms.

 


Closing the understanding gap

People’s awareness of the data collection and use that underpins many technologies has increased. But tech companies are still not giving people the information or the choices they need to be able to use services in line with their own preferences.


Coping not coding

No one can – nor should they have to – comprehend the workings of each and every digital interaction they encounter in their lives. The complexity is mind-boggling and ever increasing. But there are underpinning dynamics to technologies – the exchange of data for services – which are important to grasp.

Digital understanding is not about being able to code, it’s about being able to cope; it’s about adapting to, questioning and shaping the way technologies are changing the world.

Compared to 2018 we find that many of the gaps in digital understanding we identified are easing. But it’s a never-ending game of catch-up. New innovations quickly become widespread: 63% have used biometric recognition, 54% have used smart speakers, and 40% have used wearable devices. People are struggling to keep up with the way these work.

Because of this changing landscape, we have not directly compared questions and blindspots from 2018 but instead have focused on, and dug a little deeper into, the public’s understanding of how data is collected and used.

Digital Understanding

How your personal information is collected

The public is now more aware that organisations collect information that they actively share as they use services, for example through search and purchases (85%), using social media (71%) or filling in forms (68%). Two years ago only around two-thirds realised search and purchase information was collected.

Awareness is slightly lower around the data people share passively, either through smart devices in the home (60%) or when other people share information about them such as on social media (57%). But in 2018 only 17% were aware that information others shared about them was collected.

People’s understanding of newer technologies is more limited. Only 38% believe that biometric information such as fingerprints, face or voice data is collected, 37% believe that finger movements on a screen are collected, and only 18% believe that data about their performance at work is collected.

Understanding is significantly higher across most of these methods among the better off. But there is only slightly higher awareness of some practices among frequent users of technologies.

How your personal information is used

There is now widespread awareness that personal information is used to target advertising (79%), personalise information (75%) and to sell on to third parties (72%). This is a significant increase over our 2018 findings.

But this understanding remains shallow. For example, although three-quarters say they believe data is collected to personalise services, 41% agree that ‘when I search for something on most search engines I will see the same search results as other people’, with only 24% disagreeing.

People are less inclined to believe that organisations use data for their users’ benefit. Less than half of people (48%) think tech companies use the information they gather to improve their experience of apps and websites and only 14% believe it’s used to help protect them from scams.

Awareness of how data relates to tech companies’ business models has not increased to the same extent. Just under two-thirds (62%) of people think social media is funded through advertising that’s based either on relevance or personalised targeting. Similar numbers think this is the business model for search engines (61% for relevant and 59% for targeted advertising). This is largely unchanged from our 2018 findings.

Higher numbers now believe tech companies sell data on: 51% now say search and social media do this (up from 43% and 38% respectively) and 40% say free-to-use apps such as games or route planners do this (up from 30%). But a significant percentage (between 14% and 20%, depending on the kind of service) still do not know how tech companies make money.

People’s confidence in their ability to understand their digital world is growing.

In the two years since our last survey tech issues have gone mainstream. The introduction of GDPR combined with efforts by some companies to better explain their service to users have helped surface technology’s workings.

I think possibly because of things like MoneySavingExpert, and articles in the papers or on the telly, that we will think twice before just going into a site or clicking ‘ok’ or accepting… when we all first got involved we were perhaps all a bit too click happy without thinking about the consequences.

I’ve noticed just in recent times, I think it must have been EU law, or GDPR law… a lot of pop-ups that come up on the screen more than they used to, explaining, agreeing to allowing cookies, allowing certain information to be taken and bullet-pointing out what information they’re going to take from you.

As we’re getting smarter, I think it’s in a technology company’s best interest to try and gain a bit of trust now. I do think they are trying harder to do that. They’re maybe being a wee bit more open. It’s easier now to opt out of things.

But often people still end up guessing at what’s going on.

Maybe a smart speaker picks a word and thinks: “hang on a minute, this user is thinking about, for example, dog food”, and then the next time I open the web browser, and the first thing you see is dog food on your YouTube or Instagram feed.

It’s not that easy to know what’s being collected because every time you’re on the phone you need to accept cookies… So they could be taking information and you don’t know what information they have about you on their database.

You could be looking at something on your PC and then using your phone later at night, and what you’ve been looking at on the PC during the day automatically appears on the phone at night and you think: how did that happen?…it should definitely be more transparent to the end user as to how they do it.

Information gap

A significant gap remains between what people say is important to them and their ability to get information about those issues.

Most people can find out about the reliability, compatibility and customer satisfaction of a service before they use it (perhaps unsurprising, as this information is readily provided on app stores). But when it comes to the security, use and control of data, and broader issues of responsibility, fewer than a quarter can find out what they want to know before they use a service.

The information people want vs the information people find

This information gap can lead to frustration and a tendency to think the worst of tech companies, underlined by the drip feed of tech scandals that have lifted the lid on tech practices.

They’re not bothered about your best interests – it’s all about selling. I probably wouldn’t have said it 5 years ago because social media and this snooping on your data thing wasn’t as bad back then, they’ve got worse over time. I have control over turning the device I’m using on or off and that’s about it.

From understanding to action

Knowing about an issue only helps if people can act on that information.

We find that people do take a range of measures that stem from their digital understanding: most people have checked their privacy settings (73%), looked for news outside their filter bubble (67%) or used an ad blocker (56%). Just under half have used incognito browsing (47%) or deliberately ‘dirtied’ their data (46%) to prevent profiling, although few have used explicitly privacy-preserving services such as DuckDuckGo.

Taking action to shape online experiences

But people tend to only do these things occasionally.

I absolutely hate targeted advertising because it’s like big brother is watching over my shoulder. If I’m looking for medication for my health, or something I don’t want to disclose with anyone else, and then suddenly it appears on my Safari or Google web browser… It’s not on a regular basis but every now and then I just delete my web browsing history, just for peace of mind, I don’t know whether that actually helps or not.

In our previous research, Engaging the Public with Responsible Technology, we found that there were three elements to empowering the public. Digital understanding – or capability – is just one of them. People also need the opportunity and the motivation to take action on that understanding.

These are often missing when using the internet. For example, the prevalence of defaults along with the disproportionate effort and unclear outcomes of adjusting settings discourages people from making changes.

It’s difficult, you want to control it but you’re almost out of control. You’ve got to do the little tweaks but I don’t really think they make much of a difference if I’m honest.

This lack of opportunity to shape their experience means many people end up using services despite misgivings about what that might entail. This is illustrated by people’s attitudes to terms and conditions. Nearly half (47%) feel they have no choice but to sign up to services despite concerns and 45% feel there’s no point reading terms and conditions because companies will do what they want anyway. These attitudes are largely unchanged from our 2018 survey.

Engaging with Terms and Conditions

It’s welcome that digital understanding has improved since our 2018 survey. But there’s still a long way to go. Obfuscation does not serve the tech industry well. If tech companies want to build and maintain trust with their users they must give people both information and opportunities to shape their digital experiences to their own needs and preferences. 

This should be a benefit not a burden for companies – our survey of UK tech workers last year found a clear appetite to work responsibly: two-thirds would like more opportunities to assess the impacts of their products. Companies can start to change their services now, for example, drawing on the existing design patterns made available by organisations like Projects by If.

We recommend all tech companies implement trustworthy, transparent design patterns that show how services work and give people meaningful control over how they operate.

But, as the Competition and Markets Authority has pointed out, the lack of meaningful controls is not only bad for individual consumers, it can also inhibit competition. And so good practice must be enforced for the largest services.

We recommend the Competition and Markets Authority, in coordination with the Information Commissioner’s Office, set and enforce new design standards for understandability, transparency and meaningful choice. These should be applied in the first instance to companies with strategic market status – the platforms where people spend most of their time online.

Doteveryone is currently working with the Behavioural Insights team and Centre for Data Ethics and Innovation to explore how best to design opportunities that allow people to make choices that align with their own preferences for data use and personalisation. These will be published later in 2020.

These changes in platform design must be supported by efforts to inform and empower the public. Our report, Engaging the Public with Responsible Technology, describes new models of public empowerment that go with the grain of the digital experience. These research findings underline the need to implement these approaches.

We recommend the Government should base its forthcoming media literacy strategy around public empowerment that:

  1. Meets people where they are, with opportunities to act embedded into products and services
  2. Provides information that’s specific to the issue and tailored to the individual’s capability and mindset
  3. Enhances rather than detracts from current online experiences and generates feedback about the impact of any action, creating the motivation to act.

Creating accountability

People have high levels of concern about the potential harms that technology can cause. But their experience when they try to report problems is pitiful and many now believe harmful experiences are just part and parcel of life online. 


Concerns and complaints

There are high levels of concern about technology-driven harms, especially around problems people may have experienced or heard about in their daily lives, such as children being exposed to inappropriate material (84%), scams (83%) and bullying (74%). But they still have significant levels of concern around less tangible issues such as decision-making by AI (58%), facial recognition technologies (40%) and targeted advertising (39%).

Concern about tech-driven harms

People raised many of these issues unprompted during our discussion groups.

Women and older people tend to be more worried about potential harms. And although people on higher incomes tend to have better digital understanding and feel greater benefit from technology overall, there are few differences between economic groups in levels of concern.

When things go wrong, people really struggle to report their concerns. Only a third of people (34%) know where to go for help when they experience a problem online.

Even when they do, tech companies are often unresponsive. Only half of those who’ve reported something to a website or app believed that their action was effective. In total, over a quarter of the public (26%) say they’ve reported experiencing a problem online but that nothing happened as a result. 

There’s a clear appetite to make it much easier to raise and resolve these issues. More than half would like more places to seek help (55%) and a more straightforward procedure for reporting tech companies (53%).

A lot of these things have report buttons and stuff but I don’t even know if they’re being manned. Sometimes you can talk to the admin but how long does it take the admin to remove the content? Parents also don’t have any control on how content is removed.

It completely depends on where the issue has fallen. If it’s with a company that’s good and willing to refund you, great. If it’s not, then I think you’re in real hot water because you can’t actually physically go and see them so it is a bit of a fight and, you don’t really know where to take it to.

We’ve not got a choice but to participate in technology, it’s there, we’re drawn into it… often there’s no other option and yet when things go wrong we’re the little person on our own trying to fight the big system and it feels sometimes as though the support is not here.

Digital Disempowerment

Overall the public has a strong feeling of resignation. Two-thirds (67%) say people like them ‘don’t have any say in what technology companies do’. Half (50%) believe it’s ‘part and parcel of being online that people will try to cheat or harm them in some way’. And a third (32%) say they would like to use technology products that better reflect their values, but that these are not currently available.

This is accompanied by high levels of distrust. Only 19% believe tech companies are designing their products and services with their users’ best interests in mind.

Trust in tech companies

I don’t trust them at all. I think they’re not looking at my best interests, they’re looking at their own best interests and trying to provide whatever they can to get me to give my details over, pass on my details, sell my details. Tom, Dick and Harry are getting in touch. Constantly I’ve got about 3,000 emails, half of them are just like scam emails.

Five years ago I would have actually scored them much higher on trust. Like Facebook say, in 2007 or whenever, who knew what it would have turned out like? This past few years I trust them less and less and less!

People have serious and legitimate concerns about online harms but little recourse when they experience problems. It’s vital to address this power imbalance with effective systems of redress that take these concerns seriously and offer resolution for things that go wrong. 

Doteveryone has carried out extensive research into redress and created a prototype online resolution service. Through this work we’ve found that there are underlying principles to a good digital redress process; there’s a need for collective redress; and there is a pool of expertise in civil society organisations that could be unlocked to support the public. These findings are described in detail in our report, Better Redress in the Digital Age.

We recommend all tech companies create accessible and straightforward ways for people to report concerns and provide clear information about the actions they take as a result.

The incoming Online Harms legislation promises to regulate how companies within scope handle complaints. If done robustly, this has the potential to directly and tangibly benefit the public and build confidence that regulation is proving effective.

We recommend the incoming online harms regulator provide robust oversight of companies’ complaints processes founded on seven principles of better redress in the digital age:

  1. Design that’s as good as the rest of the service
  2. Signposting at the point of use
  3. Simple, short, straightforward processes
  4. Feedback at every step
  5. Navigating complexity
  6. Auditability and openness
  7. Proportionality

We also recommend that digitally-capable super complainants should act on the public’s behalf to demand collective redress from technology-driven harms and channel unresolved disputes between individuals and companies. 

And we call on the Government to provide financial support that unlocks the expertise of civil society in helping people address the impacts of technology-driven harms on their lives. Coordinated action between charities and support groups can help people seek redress and build understanding of the nature of online harms.
