Keep up to date with data protection and cybersecurity news with our round-up.
We cover some of the key developments over the last couple of months, what's on the horizon and how to keep your business compliant and prepared.
In this round-up we report on:
ICO guidance on human bias and discrimination in AI systems
ICO guidance on the use of cookies
The proposed GDPR fines for British Airways and Marriott International
EDPB guidelines on processing personal data through video devices
Other news in brief, including the ICO's adtech report, the draft data sharing code of practice, reform of the Cyber Essentials scheme, the ICO annual report, the NCSC's active cyber defence report and the EU Cybersecurity Act
The ICO has issued guidance on human bias and discrimination in Artificial Intelligence (AI) systems in the data protection context. The guidance explains how flaws in the data used to train and test such systems can result in algorithms that treat people less favourably on the basis of certain characteristics, such as disability, race and gender. The guidance also clarifies what organisations can do to manage the risk of discriminatory outcomes in AI systems.
If your organisation processes personal data using AI technology, you should ensure that a Data Protection Impact Assessment (DPIA) is conducted for the processing of personal data that is likely to result in a high risk to individuals. You should also undertake robust testing of any AI systems you use, monitor their performance on an ongoing basis and establish clear and effective policies for the collection of data embedded in any such systems. The approach you take to discrimination risks associated with the use of AI systems will need to be documented from the start of any AI application lifecycle to ensure that appropriate technical measures to mitigate such risks are put in place during the design and build phase.
The guidance forms part of the ICO's ongoing call for input on developing its framework for auditing AI. The ICO stated that it is in the midst of engaging with organisations on this work and that it plans to publish a formal consultation paper by January 2020. The ICO's work on AI also includes guidelines to assist organisations in explaining decisions made by AI to the individuals affected.
The European Commission has also published a factsheet on the EU's role in AI.
The ICO confirms that consent is not required for cookies that are 'strictly necessary' for the operation of an online service, but cookies which are not essential require the consent of the users of the service. Examples of 'strictly necessary' cookies include cookies that track user input for specific functions of the service (e.g. a shopping basket) or cookies that are needed for the security of a website (such as those used to detect repeated failed login attempts). However, cookies used for analytics purposes or first and third-party advertising are unlikely to be 'strictly necessary'.
If your organisation hasn't conducted a cookie audit recently, it would be advisable to do so in order to identify which cookies are strictly necessary for the operation of your online services and to ensure that consent is obtained for those that are not essential. You will need to make users aware of cookies when they first visit your online service and update your cookie policies to ensure they are in line with the guidance.
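By way of illustration only, the classification exercise at the heart of a cookie audit can be sketched as follows. The cookie names and purpose labels below are entirely hypothetical, and the legal judgement as to which purposes count as 'strictly necessary' remains one for your organisation to make on the facts:

```python
# Hypothetical cookie audit sketch: names and purpose labels are illustrative,
# not drawn from any real service or from the ICO's guidance itself.
STRICTLY_NECESSARY_PURPOSES = {"session", "security", "load-balancing", "basket"}

cookies = [
    {"name": "session_id", "purpose": "session"},
    {"name": "csrf_token", "purpose": "security"},
    {"name": "ga_tracking", "purpose": "analytics"},
    {"name": "ad_profile", "purpose": "advertising"},
]

def audit(inventory):
    """Split a cookie inventory into exempt and consent-required lists."""
    exempt = [c["name"] for c in inventory
              if c["purpose"] in STRICTLY_NECESSARY_PURPOSES]
    needs_consent = [c["name"] for c in inventory
                     if c["purpose"] not in STRICTLY_NECESSARY_PURPOSES]
    return exempt, needs_consent

exempt, needs_consent = audit(cookies)
print("Exempt from consent:", exempt)
print("Consent required:", needs_consent)
```

The point of the exercise is simply to produce a documented inventory: cookies in the second list should not be set until the user has given consent.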
The first significant GDPR fines hit the headlines in July following the announcement by the ICO of its intention to fine both British Airways and Marriott International for data breaches resulting from cybersecurity incidents.
The proposed BA fine of £183.39 million results from a cybersecurity incident involving user traffic to the BA website being diverted to a fraudulent website. As a result, the personal data of approximately 500,000 customers were compromised. More information on the proposed BA fine can be found in our insight. The proposed Marriott fine of £99.2 million relates to the exposure of approximately 339 million guest records as a result of the vulnerability of Starwood hotels group's cybersecurity systems. The breach is believed to have begun before Marriott acquired Starwood, but the exposure of customer information was not discovered until after the acquisition was completed.
Both companies will have the opportunity to make representations to the ICO before it makes its final decision about the imposition of the fines.
The proposed fines underline the importance of having appropriate security measures in place to protect customers' personal data from loss, damage or theft. The Marriott fine also highlights the importance of undertaking proper due diligence when making a corporate acquisition. As required by the accountability principle, organisations should assess not only what personal data has been acquired but also how it is protected.
The European Data Protection Board (EDPB) has adopted guidelines on the processing of personal data through video devices, which are open for consultation until 9 September 2019. Some of the key issues addressed in the guidelines are:
Lawfulness of processing: Consent is unlikely to be an appropriate legal basis for systematic monitoring, since the technology usually captures an unknown number of people at once, making it impractical to obtain valid consent from each of them in advance.
Legitimate interests is likely to be the most appropriate legal basis for processing video footage (e.g. for surveillance purposes). If your organisation is relying on this basis, you must ensure that video surveillance is strictly necessary for the purpose for which it is used and not speculative. You will also need to assess the extent to which the surveillance has adverse effects on data subjects in order to make sure that your legitimate interests are not overridden by their rights and freedoms.
Special categories of personal data: the EDPB explains when personal data processed via a video device will be treated as special category personal data and provides practical steps for businesses to deal with such data in the course of their processing activities. Businesses should ensure that special category personal data extracted from a digital image will not be excessive and will only contain the information required for the specified purpose. The processing of such data should only be carried out if there is a legal basis and a separate condition for processing.
Disclosure to third parties: businesses also need to establish a legal basis to disclose video footage to third parties.
Adtech and personal data: a recent ICO report highlights the ICO's concerns around the use of personal data in the advertising technology sector. Among other things, the report states that individuals are not adequately informed about what happens to their personal data and that there is a lack of understanding of the requirements relating to the processing of special category personal data. For the next six months, the ICO will continue to engage with the sector to obtain more information on its main areas of concern.
Data sharing code of practice: the ICO has published its draft data sharing code of practice which is open for consultation until 9 September 2019. The code has been revised to reflect changes introduced by the GDPR and the Data Protection Act 2018 and aims to provide organisations with practical guidance on how to share personal data in compliance with the legislation, as well as good practice recommendations.
If you are interested in responding to the consultation, you can do so here.
Reform of Cyber Essentials scheme: the National Cyber Security Centre (NCSC) has announced that it plans to make changes to the Cyber Essentials scheme in the UK. The scheme encourages organisations to adopt good cybersecurity practice to protect themselves against common online threats.
Changes include the introduction of new minimum criteria for certification bodies/assessors and a 12-month expiry date for all certificates awarded under the scheme. If your organisation already uses the scheme or is keen to obtain certification, you should read the NCSC's post and monitor developments in this area carefully.
ICO annual report 2018-19: on 8th July, the ICO published its first annual report since the GDPR was implemented, covering key issues such as complaints, the preparation of statutory codes, investigations and fines. Data protection complaints made to the ICO almost doubled compared with 2017-18, with complaints about subject access requests at the top of the list. Clearly this is an area that organisations should continue to focus on as part of their ongoing compliance programmes.
Active cyber defence: in its active cyber defence report for 2019, the NCSC discusses its strategy and actions taken to reduce cyber-attacks. The focus of NCSC's work in 2018 was on dealing with fraudulent websites and phishing campaigns. Future work includes the development of a new automated system to allow the reporting of suspicious emails and the creation of a web-based tool to help critical national infrastructure providers scan their internet-connected infrastructure for vulnerabilities.
EU Cybersecurity Act: The EU Cybersecurity Act entered into force across the EU (including the UK) on 27th June 2019. It deals with cyber-attacks and aims to build strong cybersecurity in the EU. Under the Act, the European Union Agency for Cybersecurity (ENISA) will coordinate the preparation of an EU-wide cybersecurity certification framework for ICT products and services. The framework aims to make it easier for businesses to trade across borders and for buyers to better understand the security features of a product or service. More information on this can be found in ENISA's announcement.
This publication is intended for general guidance and represents our understanding of the relevant law and practice as at July 2019. Specific advice should be sought for specific cases. For more information see our terms and conditions.