Researchers: Users shouldn’t be on the hook for data security

(Credit: Getty Images)

Users should not ultimately be the ones responsible for protecting their data, as the long, cryptic privacy settings on many apps suggest, researchers conclude in a new study.

In reality, we have almost no power over our data and who uses it for what, according to the study.

The entire notion that we ourselves have the power to protect our data is both unrealistic and unfair, says coauthor Irina Shklovski, a professor in the University of Copenhagen’s computer science department.

“It is utopian to expect that we ourselves should be able to understand what it means when, for example, we click on ‘accept cookies’. Because, what exactly have we agreed to? The vast majority of people give up and consent, which makes sense, because in reality, we have almost no power over our data. Furthermore, I believe that it is far too great a responsibility to place upon individuals, that we ought to manage our data ourselves,” says Shklovski.

Along with her American colleagues, Shklovski interviewed 25 adults, asking the respondents to reflect on a range of data protection scenarios. The study is also based upon more than 100 research papers on the subject.

“Our analysis demonstrates that many people don’t know what their choice means and what rights they have with regard to their data. At the same time, they feel an enormous responsibility to protect their privacy. Many feel guilty that they aren’t doing enough to opt out. Furthermore, our study respondents felt that they needed to choose between either protecting their data or being socially excluded,” says Shklovski.

Although we have the right to decline cookies and to withhold photos and other private information from companies, that choice can have major implications for our social lives.

Technology and relationships are tightly interwoven. To use Facebook, one needs to accept that the company is, in principle, entitled to everything that one writes and shares on its platform.

“You might think—but can’t I just do away with Facebook? Sure, you can. But will everyone else do so too? If it’s just you, you run the risk of being left out of social events. For example, a participant in the study described how Facebook was her only way of finding out about events and activities at her church. That makes it hard to just quit Facebook, because you are left out socially,” says Shklovski.

Correspondingly, our work lives are deeply rooted in technologies that require us to submit information about our private lives.

“I rely on Google Docs, where my colleagues and I share articles and comment upon and write in each other’s texts. The program recently asked me for my birthday. Thankfully, I could say no. But then the program wouldn’t open. So, we have a choice, but it’s incredibly limited, and I ended up entering my date of birth. I find it uncomfortable to share this kind of information,” says Shklovski.

So far, the debate surrounding data protection has been mostly about whether it is possible to design apps that, with shorter and more easily understood text, can explain what a person accepts when they select “ok” to accept cookies. According to Shklovski, however, it shouldn’t be up to the individual to protect their data.

“Of course we have a responsibility for what we post on social media, for example. And, we need to think about that. But I fundamentally believe that, at present, there is too much responsibility placed upon individuals—we must stop deluding ourselves into believing that we’re all legal experts with enough time and knowledge to understand these lengthy descriptions about data law. Instead, we should create democratic institutions or associations which ensure that companies comply with rules that guarantee user privacy,” she says.

“As things stand, we fully empower companies like Google and Facebook because we often accept their terms to maintain our social and digital lives. However, we are being exploited by companies that use our data to target advertisements about different products and so on. We’re told that we have all the power to choose, while in reality, we just don’t.”

Source: University of Copenhagen