Widespread data collection leads to self-censorship and discrimination, even though most users are not fully aware of the extent to which their privacy is infringed, a parliamentary committee has been warned.
On Wednesday, the human rights committee, beginning its inquiry into the right to privacy and the digital revolution, published evidence from privacy and data protection organisations including the Information Commissioner’s Office, Liberty and Privacy International.
Taken as a whole, the submitted evidence paints a picture of a nation that does not understand what happens to its data, cannot give meaningful consent to how it is used, and ends up self-censoring for fear of being watched.
Liberty said in its submission: “That private companies exploit our data for commercial purposes is now a normalised part of our everyday existence. The data collected can reveal and manipulate our deepest and most sensitive thoughts and feelings – including our political views.
“The normalisation of these processes also threatens our freedom of expression and association by making it clear that we are being watched. Studies have shown that we are likely to censor what we post on social media or what we look up online when we are aware they are being surveilled.”
The ICO warned that the modern data economy risks embedding discrimination in the fabric of society. When a company uses “lookalike audiences”, for instance, it applies algorithms and analytics to identify the individuals most likely to share characteristics with those it wishes to target.
“There is growing evidence that inherent biases are built into algorithms resulting in the risk of discriminatory outcomes, which runs contrary to the principle of fairness. Additionally, the principles of consent, transparency and accountability are all engaged by this activity,” the ICO said.
The Law Society of Scotland warned that, while the EU’s general data protection regulation (GDPR) goes a long way towards protecting citizens from abuses of personal data, “consumers may not fully understand the potential impact that certain uses of their data might have”. It suggested that education about data use and consent might be important. “Relevant issues to consider in this context include: the (increasing) complexity of algorithms; the extent to which personal data is shared with other organisations; and the extent to which personal data is collected, commoditised and used for personalised marketing.
“While some actors uphold the principles of transparency and offer real choice to consent, we are aware that others are still not meeting the basic requirements of the UK data protection legislation and do not offer consumers real choice,” it said.
Privacy International warned that the focus should not just be on data that was collected, since many companies accurately inferred far more personal information than they collected: “Companies routinely derive data from other data, such as determining how often someone calls their mother to calculate their credit-worthiness.
“As a result, potentially sensitive data, such as future health risks, can be inferred from seemingly mundane data. Combined data can reveal people’s political and religious views; socioeconomic status; finances; spending habits; physical and mental health; relationship status; sexual preferences; family planning; internet browsing activities; and more.”
Privacy International added: “Combining data may expose patterns of behaviour people themselves are not aware of and highly sensitive information that they did not knowingly provide.”
The inquiry will begin taking oral evidence over the coming months, before reporting on whether Britain needs new safeguards to regulate the collection, use, tracking, retention and disclosure of personal data by private companies.