As more of our everyday life takes place on the internet, we inevitably share more of our personal data with online businesses, and that sharing appears to be fuelling growing apprehension over the privacy of our data.

A recent survey by GSMA Intelligence found 70 per cent of consumers are concerned about the privacy of their data and 48 per cent say their concerns are growing. Worries centre on identity theft, which 59 per cent cited as their biggest anxiety, while 45 per cent are most troubled by the potential financial impact.

And yet, even after suffering a security breach, the same survey shows nearly half of consumers fail to take the most basic precaution of changing a password, 72 per cent do not add a second layer of security, and 69 per cent leave their privacy settings unchanged.

People appear to be doing very little about their privacy concerns, so what is the reason for this inertia? Do we really care about what happens to our data?

The answer depends on a number of dynamics: whether consumers are aware of the risks, whether they consider themselves responsible for their data, and whether they have access to tools and settings for managing it easily:

  • Privacy risks: A large majority (70 per cent) appear aware of the risks of sharing data, at least when asked in a survey, but those risks may not be top of mind when people are desperate to get their hands on compelling digital content or services, especially when only 22 per cent have experienced a security incident in the past two years.
  • Responsibility: It is also perfectly valid to care about data privacy but do nothing about it because it’s considered someone else’s responsibility. Here, opinion is divided: 20 per cent consider the online business which handles the data to be responsible, while only 18 per cent consider themselves responsible for securing their data.
  • Tools and settings: Of course, if consumers do care about their data privacy they need easy access to the privacy settings and tools that let them manage it. However, in the UK alone more than 600,000 businesses are registered to process personal data, and consumers are estimated to hold as many as 200 online login accounts each on average. Effectively managing privacy settings and permissions across many hundreds of websites and apps becomes an overwhelming task, unfeasible without tools to automate and centralise the process.

The answer, then, isn’t that people don’t care about privacy, but that they don’t seem to care enough at the key touch points where they consciously share their data.

For instance, most Europeans already click blithely through websites’ ubiquitous cookie consent notices, 57 per cent of which use design to nudge users towards the agree button and privacy-unfriendly defaults, according to a recent study by the University of Michigan and Ruhr University.

Consumers, doubtless, should care more, especially as the threat to data privacy will get worse with the growing deployment of AI and machine learning. A common approach in AI and big data deployments, after all, is to vacuum up more data than is required for a particular purpose, often within opaque black-box algorithms. This runs against several clauses of the EU’s General Data Protection Regulation (GDPR), not least the seven principles of Article 5, which include purpose limitation, data minimisation and storage limitation. So regulations covering AI do exist, but policing them is a big problem for central authorities, and unless consumers’ permissions are integrated into algorithms by design, privacy controls slip out of reach of both regulators and consumers, whether they care or not.
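To make the by-design idea concrete, the short sketch below shows one hypothetical way a consumer’s permissions could gate a processing pipeline, filtering data for purpose limitation and data minimisation before any algorithm sees it. The names used here (ConsentRecord, minimise, the example purposes) are illustrative assumptions for this article, not a real product or the mechanism the GDPR prescribes.

```python
# Hypothetical sketch: enforce purpose limitation and data minimisation
# before personal data reaches a model or analytics pipeline.
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Purposes a consumer has explicitly agreed to, per data field."""
    permissions: dict = field(default_factory=dict)  # e.g. {"email": {"billing"}}

    def allows(self, data_field: str, purpose: str) -> bool:
        return purpose in self.permissions.get(data_field, set())


def minimise(record: dict, consent: ConsentRecord, purpose: str) -> dict:
    """Return only the fields this consumer has permitted for this purpose."""
    return {k: v for k, v in record.items() if consent.allows(k, purpose)}


# Example: a marketing model only ever sees fields consented for "marketing".
consent = ConsentRecord(permissions={"email": {"billing", "marketing"},
                                     "location": {"billing"}})
raw = {"email": "a@example.com", "location": "51.5,-0.1", "name": "A. Consumer"}

print(minimise(raw, consent, "marketing"))  # {'email': 'a@example.com'}
print(minimise(raw, consent, "billing"))    # email and location only
```

The point of the sketch is simply that the consent check sits in front of the algorithm rather than in a settings page the consumer rarely visits, which is what "privacy by design" would require in practice.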

If we also consider poor digital literacy in certain demographics and regions, and the power of compelling content to override privacy concerns, this consumer mindset of careless resignation looks set to continue.

Consumers’ failure to match their vociferous privacy concerns with active care is a problem which will only get worse as 1.2 billion more consumers connect to the mobile internet over the next five years, many of them in developing countries where privacy awareness, responsibility and access to management tools can be even lower. Alternatively, we could see a new generation embrace digital wellbeing (described by Google as a better balance with technology) and demonstrate an appetite to actively care about privacy: protecting their identity from theft, protecting their net wealth from exploitation, and guarding their knowledge against fakery.

Driving this awareness, however, will be the responsibility of the entire industry: all signs suggest consumers won’t simply get more interested in their privacy on their own.

– Mark Little – senior manager, Consultancy, GSMA Intelligence

The editorial views expressed in this article are solely those of the author and do not necessarily reflect the views of the GSMA, its Members or Associate Members.