Facial recognition & privacy

Transparency is a fundamental principle under GDPR.

Transparency means being clear, open and honest about how an organisation uses your personal data, and this must be communicated to individuals in a way that is accessible and easy to understand.

Privacy rights and apps

The FaceApp challenge has swept the internet over the last month, with many celebrities uploading digitally altered images of their faces. This has prompted many more people to take part in the #FaceAppChallenge, seeking likes from their followers on social media.

The controversy surrounding the App, and the privacy issues it raises, shows that users need to be more aware of what they are signing up to when they download an App. FaceApp is not the only App that collects user data, and its privacy policy is similar to those of many other Apps and tech services. Research from global cyber security company Kaspersky Lab shows that 47% of people don’t check the permissions on Apps before installing them. Users often accept pop-up permission requests without reading them. In some cases this can grant an App almost unlimited access to mobile content, which would be inconsistent with GDPR’s principle of data minimisation.

The Information Commissioner’s Office (ICO) recommends that, in order to protect their personal information, users should carefully read an App’s permission requests before downloading it. To take control of personal information on social media Apps, users should check the privacy and advertising settings before using a particular service. The ICO has useful factsheets covering Facebook, Snapchat, Twitter, LinkedIn and Google to help the public change their advertising settings on a range of platforms.

In a world driven by data and technology, it is increasingly important to know what organisations are doing with your personal data. The ICO is currently considering the criticisms of FaceApp and advising people to be clear on what happens to their personal data when using Apps. GDPR requires organisations to be transparent about how personal data is handled and used. The controversy surrounding FaceApp has certainly helped to highlight the issue, and may make people more aware of the rights they grant when they download an App.

Facial recognition technology

Another high priority area for the ICO is the use of facial recognition technology. This technology collects information about a person’s facial features, which is classified as biometric data. Any organisation using, or considering a system that uses, biometric data needs to be aware of the potential challenges.

Manchester City FC has recently been warned against installing facial recognition technology intended to allow fans to enter the stadium by showing their faces instead of tickets. This controversial move has been heavily criticised by the civil rights group Liberty and comes just after news that the ICO is inspecting the use of facial recognition technology at King’s Cross, central London. Elizabeth Denham, the Information Commissioner, stated that “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.” She highlighted that this technology is often used without consent, and an investigation has been launched.

In Sweden, a school conducted a pilot using facial recognition technology to monitor student attendance. The Swedish Data Protection Authority (DPA) has just issued its first fine, of 200,000 SEK (approximately 20,000 euros), after concluding that the pilot breached GDPR. The school had processed sensitive biometric data, and had failed to complete an adequate impact assessment and to consult the Swedish DPA. The school argued that the pilot was based on ‘consent’; however, the DPA concluded that there was a ‘clear imbalance between the data subject and the controller’, meaning consent could not be freely given.

The ICO says that any organisation using this technology must be transparent and accountable, and must document why the technology is being used. Under GDPR, organisations must conduct a Data Protection Impact Assessment (DPIA) where processing is likely to result in a high risk to individuals, to help identify and minimise those risks. Organisations can use the DPIA template available from the ICO, or develop one more specific to their organisation’s needs.

Organisations need to be aware that, by using facial recognition technology, they are processing personal data. The ICO publishes a comprehensive list of examples of processing deemed high risk, such as the use of innovative technology or large scale profiling. Examples include workplace access and identity verification systems, fingerprint recognition, and data gathered by smart meters.

If you have any questions or would like to discuss any aspect of this article, please contact Rebecca Ellis, Trainee Solicitor at rebecca.ellis@weightmans.com or 0141 404 9312.