ICO and EU have conflicting views on anonymised data

Channel News

Everyone needs to agree say lawyers

Conflicting views within the EU threaten to undo some of the work the ICO has done towards helping businesses reap greater rewards from ‘big data’ processing.

Luke Scanlon, a technology lawyer with Pinsent Masons, the law firm behind Out-Law.com, wrote that IBM estimates that every day 2.5 quintillion bytes of data are produced. However, he pointed out that research by the Economist Intelligence Unit suggested that over 75 percent of businesses were wasting more than half the data they already hold.

Mr Scanlon said much of this wastage was caused by uncertainty and a lack of understanding about the precise requirements of EU data protection and security rules.

According to the lawyer, EU data protection law says that personal data which has been anonymised is not protected by data protection rules. He said this raised a problem for businesses, which must consider whether they can show, through internal risk assessment processes, that they have effectively anonymised data.

However, this raised the risk of possible “re-identification”, or the “unanonymising” of data, “through matching data released by one company with other data that may be in the public domain.”

However, the ICO has put new guidance in place which could contradict this.

Last week it outlined a new code of practice on anonymisation, which states that organisations that anonymise personal data can disclose that information, even if there is a “remote” chance that the data can be matched with other information and lead to individuals being identified.

The watchdog added that businesses that take steps to mitigate the risk of anonymised data being used to identify individuals will be considered to have complied with the Data Protection Act (DPA), even if those steps cannot eradicate the threat entirely.

Mr Scanlon said that other EU member state regulators should be encouraged to advise businesses to take a similar approach to the UK on anonymisation of data.

However, he pointed out that initial discussions with German regulators and academics suggested that other countries were minded to take a more restrictive approach.

He referenced Out-Law.com’s initial discussions with German regulators, which suggested that “the possibility of data being released and then being matched to other data obtained by illegal means would not count just as a remote risk, but as one that must be assessed by every business before they conclude that data has been truly anonymised.”