As more of our everyday life takes place on the internet, we inevitably share more of our personal data with online businesses, and apprehension over the privacy of that data appears to be growing.
A recent survey by GSMA Intelligence found 70 per cent of consumers are concerned about the privacy of their data and 48 per cent say their concerns are growing. Worries relate specifically to identity theft (59 per cent cited it as their biggest anxiety) with 45 per cent most troubled about financial impacts.
And yet, even after suffering a security breach, the same survey shows nearly half of consumers are failing to take the most basic precaution of changing a password, 72 per cent are failing to add a second layer of security, and 69 per cent are failing to change their privacy settings.
People appear to be doing very little about their privacy concerns, so what is the reason for this inertia? Do we really care about what happens to our data?
The answer depends on a number of dynamics: whether consumers are aware of the risks, whether they consider themselves responsible for their data, and whether they have access to tools and settings for easily managing it.
The answer, then, isn’t that people don’t care about privacy, but rather that they don’t seem to care enough at the key touch points where they consciously share their data.
For instance, most Europeans already click blithely through websites’ ubiquitous cookie consent notices. A recent study by the University of Michigan and Ruhr University found that 57 per cent of these notices use design nudges that steer users towards the agree button and privacy-unfriendly defaults.
Consumers, doubtless, should care more, especially as the threat to data privacy will worsen with growing deployments of AI and machine learning. A common approach in AI and big data deployments, after all, is to vacuum up more data than is required for a particular purpose, often within opaque black-box algorithms. This contravenes several clauses of the EU’s General Data Protection Regulation (GDPR), not least the seven principles of Article 5, which include purpose limitation, data minimisation and storage limitation. Regulations covering AI therefore exist, but policing them is a major challenge for central authorities. And unless consumers’ permissions are integrated into algorithms by design, privacy controls remain out of reach for both regulators and consumers, whether they care or not.
If we also consider poor digital literacy in certain demographics and regions, and the power of compelling content to override privacy concerns, consumers’ mindset of careless resignation looks set to continue.
Consumers’ failure to match their vociferous privacy concerns with active care is a problem that will only get worse as we connect 1.2 billion more consumers to the mobile internet over the next five years, many of them in developing countries where privacy awareness, responsibility and access to management tools can be even lower. Alternatively, we could see a new generation embrace digital wellbeing (described by Google as a better balance with technology) and demonstrate an appetite to actively care about privacy: protecting their identity from theft, protecting their net wealth from exploitation, and guarding their knowledge against fakery.
Driving this awareness, however, will be the responsibility of the entire industry: all signs suggest consumers won’t simply get more interested in their privacy on their own.
– Mark Little – senior manager, Consultancy, GSMA Intelligence
The editorial views expressed in this article are solely those of the author and do not necessarily reflect the views of the GSMA, its Members or Associate Members.