Data protection and cybersecurity news has rarely left the headlines since the GDPR came into force in 2018. To mark Data Protection Day 2020, we’re reviewing some of the major developments over the last year and what they might mean for the future.
The ICO announced its intention to levy the first significant GDPR fines in July 2019, in response to major cybersecurity incidents at British Airways and Marriott International. While the scale of these fines is still being decided, enforcement action against such well-known companies brought home to many directors for the first time the real commercial risks of getting compliance wrong, and the regulatory focus on appropriate security measures for customers' personal data. The conclusion of the ICO's actions against British Airways and Marriott International, and the approach of other European regulators to issuing fines under the GDPR, are worth watching in 2020.
The use of biometrics is becoming a bigger part of everyday life, from unlocking phones with fingerprints, to iris scanning at airport security gates, to voice recognition when we talk to Alexa or Siri. Many companies are keen to stress the convenience and security advantages of biometrics for consumers (particularly in financial services, where biometrics are playing a big part in payment authentication and fraud detection). However, civil liberties campaigners are keen to highlight the risks biometrics pose in the event of a security breach, and any organisation deploying biometrics will need to comply with the more stringent 'special category' personal data provisions of the GDPR.
The ICO’s publication of its Age Appropriate Design Code adds another layer to recent discussions about the responsibilities of technology platforms to protect young users from harm. The 15 standards set out in the document reflect principles that organisations should already be taking into account when providing services to children, but also flesh out the requirements and give a better idea of regulatory expectations. While the Code itself isn’t legally binding it has significant persuasive force, and an organisation choosing not to comply with it is likely to find it harder to demonstrate compliance with the GDPR.
The vast amount of data that businesses hold about consumers makes it easier than ever to personalise and target ads for maximum engagement, fuelling an explosion in the adtech industry. It’s worth remembering, however, that the first fine under the GDPR related to Google’s advertising arrangements, and the ICO continues to investigate data protection practices in adtech, following its damning report in July 2019 indicating severe compliance issues. While the ICO is still encouraging significant improvements in self-regulation by the marketing sector (particularly around security, data minimisation, data retention, and real-time bidding arrangements), if those improvements don’t come quickly, 2020 is likely to see much tougher regulatory intervention.
In November 2019, in response to a number of high profile cases in which police forces used live facial recognition (LFR) in public, the Information Commissioner issued her first formal Opinion on the subject. The Commissioner called on government "to introduce at the earliest opportunity a statutory binding code of practice to provide further safeguards that address the specific issues arising from the use of biometric technology such as LFR". The news in early January that another UK police force plans to begin operational use of LFR has stirred up opposition from privacy campaign groups, though the force has defended itself, pointing to its ‘transparent approach’ and the value of the technology in assisting the police in combatting serious crimes. With significant public interest in the issue and pressure on government to act on the ICO’s recommendations, the likelihood of a statutory code of practice for police use of LFR seems high.
In the rush to respond to a growing cyber threat, organisations of all sizes have been equipping themselves with the resources and expertise needed to address privacy and cyber risks. However, organisations should take a joined-up approach: a coordinated strategy, with an accountable cybersecurity and data privacy leader in place, will deliver greater resilience against attacks and data loss, and enable a much better response should an incident occur.
The issues and challenges surrounding data protection will inevitably continue in 2020, and organisations will need to adapt to changing regulations and processes, particularly given the possibility of new restrictions on data transfers depending on the shape of any Brexit deal.
We are awaiting judgment in several key cases that will shape how things progress, most obviously the Morrisons data breach case (heard by the Supreme Court in November 2019) and the CJEU judgment on whether the standard contractual clauses used by Facebook to transfer personal data from Ireland to the US are valid.
The European Commission is also in the midst of reviewing the functioning of the GDPR, with the aim of submitting a report on its review by 25th May 2020. In its submissions to the Commission the European Council has suggested that more practical guidance from the European Data Protection Board, sector-specific codes of conduct, and greater co-operation between competition, consumer and data protection authorities in regulating big tech, should be high on the agenda. It would be a surprise if the Commission doesn’t take a similar view.
This publication is intended for general guidance and represents our understanding of the relevant law and practice as at January 2020. Specific advice should be sought for specific cases. For more information see our terms & conditions.
28 January 2020
by Emma Erskine-Fox