The privacy paradox: If we care about our online privacy, why don't we do more to protect it?
Douglas White, head of advocacy at the Carnegie UK Trust, on how to define, value and better protect our online privacy
If you have ever absentmindedly clicked ‘accept’ on terms and conditions without reading, reused the same password across multiple sites or gone months before checking your privacy settings, you are not alone. At the same time, you may also be part of the nearly three quarters of people in the UK who are concerned about their online privacy.
While this apparent disconnect between our attitudes and actions is not unique to data privacy, it becomes increasingly clear with every passing week that our ability to make informed decisions about our data is an issue that is worthy of greater consideration.
With Facebook’s Cambridge Analytica scandal and the implementation of GDPR legislation, there’s been a marked increase in focus this year on global privacy and the regulatory landscape. The question of how we define, value and better protect our privacy in the digital age has arguably never been a more significant and complex challenge.
Over the last 10 months, we’ve been working with leading research firm Ipsos MORI to analyse 50 pieces of evidence published within the last three years on citizens’ attitudes and behaviours towards online privacy in the UK, including data from public, private and academic institutions.
While our analysis of the available research shows that most people do care about privacy online, many of us fail to take the necessary steps to better protect our personal data. This may be because we don’t know how to, choose not to, are not fully aware of the possible risks, or feel powerless to do so. This mismatch between attitude and action has been dubbed the ‘privacy paradox’ by researchers.
The evidence highlights some important issues that require further exploration. For example, people generally say they are more comfortable sharing data with public sector organisations than private companies – but how does this play out in practice? Do we genuinely have the desire or opportunity to regulate our interactions with companies in line with these attitudes; and how do public bodies build on this apparent trust in order to make the best use of data for public good, while effectively mitigating the risks?
There are also important variations between demographic groups. Younger people appear to be more privacy conscious than older people in many scenarios – but in other contexts the opposite is true. Meanwhile, people from more deprived backgrounds appear more likely to be exposed to privacy risks than those from more affluent households. This raises the question of how people are supported to develop the skills and confidence needed to navigate privacy considerations online, and whether this support is appropriately tailored to the needs of different groups.
The research also investigated data trade-offs. Many online products and services can be accessed in exchange for data about our behaviour or interests. Unsurprisingly, the evidence shows that people in the UK make numerous such trade-offs with their data, whether for access to free or discounted products or services, for a better, more personalised service, or simply through lack of an alternative.
But access to the online world is not always free, and the cost is not always obvious. As the ‘old’ saying goes – if the product is free, then you are the product. However, the story isn’t always quite as one-dimensional as headlines often present it. People make data trade-offs for a variety of reasons beyond purely personal gain, including, for example, allowing access to their data to support action in the ‘public good’, such as medical advancements.
Our review found that the issue of online privacy is itself not extensively researched, at least in comparison to other matters of public concern. As a relatively recent concept (the online aspect, anyway), there is limited trend data available to track how public attitudes and actions on online data privacy have changed over the years. There are gaps in the evidence in terms of the views of different groups in society – including ethnic minority communities and disabled people. Meanwhile, terminology is applied differently across different studies, and much of the evidence is based on people’s personal perceptions rather than more objective measures. More and better evidence will be required in the future to help inform our approach to these issues.
Interestingly, we also found that even though people are concerned about data privacy and don’t necessarily act to protect it, most of us are quite confident in our ability to manage our privacy online.
The challenge now is to identify the steps we need to take as a society to ensure that this confidence is not misplaced, and to enable everyone to have the information and skills they need to make the right decisions for them about what data they share and how it is used.
Douglas White, Head of Advocacy, Carnegie UK Trust