It Turns Out People Are Better at Protecting Their Privacy Than Companies Would Like

When social networks force openness, users become more guarded in what they share.

The struggle between Facebook, Google, and their users has led to an unexpected result, contends a new book on privacy: every time social networks force openness on their users, people become much more guarded in what they share, leading the internet giants to push for yet more openness. This is the argument made by three academic researchers, Antonio Casilli, Yasaman Sarabi, and Paola Tubaro, in their new book, “Against the Hypothesis of the End of Privacy.”

The researchers find that the end of privacy is only one possible outcome of the way online behaviour is evolving, and not the most likely one. At the heart of their argument is the idea that users aren’t inert. Far from accepting a steady, linear erosion of their privacy, users of social networks react to changes by over-protecting their privacy. Every time a network tries to make itself more open, its users, in aggregate if not individually, respond by closing themselves off even more. It is a constant tussle. The authors call these swings “cycles of privacy.”

But people have control over settings in only a limited number of cases, such as on social networks. They are less powerful when it comes to the snooping apparatuses set up by the world’s governments. That said, the exposure of, and subsequent public debate over, data collection by Western spy agencies only boosts users’ caution. And they are increasingly demanding tools that allow them to protect themselves better.

Antonio Casilli, one of the authors of the book and a professor at Telecom ParisTech, explains it thus: when you first join a network, you want to explore, so you expose yourself more. Eventually you realize you’ve been giving away too much, so you pull back and start over-protecting your privacy. This is the moment when privacy becomes cyclical. If millions of users start over-protecting their privacy, the platform owners have to do something about it, so they come up with policies that reopen the privacy settings that were closed. The reaction is then to over-protect again. And so on.

Casilli and his colleagues came to their conclusions using agent-based modelling, a computer-simulation technique that mimics interactions between individuals and groups. Casilli says they didn’t use real-world data because it is both hard to obtain from corporations and comes with strings attached when it is made available. Samples of data could have helped calibrate the researchers’ model, but Casilli says other methods can also improve reliability, such as checking the results against past events with known outcomes.
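The article doesn’t reproduce the book’s simulations, but the mechanism is simple enough to sketch. The following Python toy model is purely illustrative and is not the authors’ code: every name, parameter, and update rule (the exposure creep, the personal comfort thresholds, the platform’s “reopening” policy) is an assumption chosen only to show the feedback loop Casilli describes, in which individual over-protection and periodic platform reopenings produce oscillation rather than a linear decline.

```python
# Illustrative agent-based model of "cycles of privacy" -- an assumption-laden
# sketch of the mechanism described in the article, not the authors' model.
import random

random.seed(42)  # reproducible run

NUM_AGENTS = 1000
NUM_STEPS = 200
POLICY_COOLDOWN = 25   # minimum steps between platform policy changes
REOPEN_TRIGGER = 0.25  # aggregate exposure below this prompts a reopening


class Agent:
    """One user: an exposure level and a personal comfort threshold."""

    def __init__(self):
        self.exposure = random.uniform(0.05, 0.25)  # share of life shared, 0..1
        self.threshold = random.uniform(0.4, 0.8)   # comfort limit

    def step(self):
        # Ordinary use: exposure creeps up as the agent explores the network.
        self.exposure = min(1.0, self.exposure + 0.005)
        if self.exposure > self.threshold:
            # Over-protection: on realizing too much is visible, the agent
            # closes far more than needed instead of easing back.
            self.exposure = random.uniform(0.02, 0.10)


def simulate():
    agents = [Agent() for _ in range(NUM_AGENTS)]
    history, last_change = [], -POLICY_COOLDOWN
    for t in range(NUM_STEPS):
        mean = sum(a.exposure for a in agents) / NUM_AGENTS
        # Platform reaction: once users have closed up in aggregate (and the
        # cooldown has elapsed), reopen previously closed settings for everyone.
        if mean < REOPEN_TRIGGER and t - last_change >= POLICY_COOLDOWN:
            for a in agents:
                a.exposure = min(1.0, a.exposure + 0.5)
            last_change = t
        for a in agents:
            a.step()
        history.append(sum(a.exposure for a in agents) / NUM_AGENTS)
    return history


if __name__ == "__main__":
    for t, mean in enumerate(simulate()):
        if t % 5 == 0:
            print(f"step {t:3d}  mean exposure {mean:.2f}  " + "#" * int(mean * 50))
```

Run as-is, the aggregate exposure traces a sawtooth: each platform reopening is followed by a wave of over-protection, the cyclical pattern the authors report, rather than a steady slide toward total openness.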

“If you think of the end-of-privacy discourse, it is always something that is presented in a linear way,” says Casilli. “We were surprised by the cyclicality of the results.”

(Image via JuliusKielaitis / Shutterstock.com)