By Joe Green

Imagine you’re on a busy train texting a friend when you notice a stranger sneakily reading your texts from over your shoulder. Your natural response would probably be to recoil as you realise your privacy is being invaded. So why is it that when we’re online and under the watchful eye of data-hungry corporations, most of us wouldn’t think twice about readily sharing that same information?

Privacy Paradox

When asked directly, most people report valuing their personal privacy and being intent on protecting it, but research shows that those same people will often forgo it in exchange for more convenient ways to work and socialise. This is known as the ‘Privacy Paradox’; people claim to care about their personal data but then do very little to protect it online.

Some theories have explained this contradiction by claiming that users are simply making reasoned cost-benefit privacy calculations, while others suggest that we are falling victim to a range of cognitive biases. However, in a recent paper Azim Shariff, William Jettinghoff and I argue that a more complete understanding of this phenomenon requires looking back to our evolutionary roots.

The Evolution of Privacy

Broadly speaking, privacy functions to selectively control access to oneself or one’s group. We suggest that within our ancestral past, the role of privacy has been to help protect our body, territory, and reputation while around others. Consequently, millions of years of face-to-face interaction have seen us develop a toolkit of privacy-based intuitions which help to manage both our physical and psychological boundaries. Yet these evolved intuitions are often ill-equipped to deal with the emerging challenges of the digital environment.

An ‘evolutionary mismatch’ describes the negative consequence that occurs when a trait that evolved within one environment enters another. A classic mismatch example is our fondness for high-calorie foods – this was an adaptive trait within the ancestral environment where such foods were nourishing yet sparse. But today, when there’s a McDonald’s on every high street, this same trait can lead to obesity and diabetes. And so it is with privacy traits, too. Below we describe three related mismatches:

Ownership Psychology

Working out who owns what can be a tricky business, but humans have developed a set of ownership conventions to guide the process. Important cues like ‘who first possessed an object’ help us intuitively discern ownership within the interpersonal environment, but these cues often get blurred online. For example, who is the first possessor of a Facebook user’s data, Facebook or the user? This ownership ambiguity often leaves us unsure about what is ours and whether we ought to protect it.

Personal Space

Physical privacy enables us to navigate social space without becoming overstimulated. But within an online space, social cues relating to size, nature, and proximity of onlooking crowds tend to disappear. As a result, we’re often worse at managing self-disclosure: sending that regrettable late-night tweet to your 200 followers is a lot easier than announcing it to an in-person crowd of 200.

Reputational Concern

Giving off a good or bad impression comes with social rewards and repercussions. As such, we instinctively modify what we say or do around others. For instance, if you had an Amazon employee sitting in your living room recording your every utterance – unlikely, I know – you’d be wary about what you were saying. But when Amazon’s smart speaker Alexa is doing just that silently in the background, chances are you’d talk freely and without a filter.

Many modern technological surveillance devices lack the anthropomorphic cues necessary to trigger our instinctive reputational concern and downstream privacy behaviour.

A Future

E.O. Wilson once described us as having “Stone Age emotions, medieval institutions, and godlike technology”. Indeed, expecting Stone Age brains to self-manage personal data in an ever-changing digital environment is both unreliable and unrealistic. Reading the privacy policies of every website you visit in a year would, on its own, take roughly a month! Instead, online culture and expectations need to change – we as consumers must demand more top-down privacy protection from these godlike technologies. After all, none of us want someone looking over our shoulder.

Joe Green is a PhD student under the supervision of Dr Matt Easterbrook. Joe’s research studies how moral intuitions affect our attitudes towards modern issues like Universal Basic Income, automation, and online privacy. 

Find out more about our research on Social and Applied Psychology.
