Biometrics and emotion analysis risk increasing discrimination, ICO warns

UK regulator to issue guidance on deployment of “immature” technologies.

The UK Information Commissioner’s Office (ICO) is warning organisations against using emotion analysis technologies without first assessing the public risks. The regulator fears that, because the algorithms for detecting emotional cues are not yet sufficiently developed, there is a risk of systemic bias, inaccuracy and discrimination.

“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination”, said Stephen Bonner, Deputy Commissioner.

Emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture. The analysis relies on collecting, storing and processing a range of personal data, including subconscious behavioural or emotional responses and, in some cases, ‘special category data’. That kind of data use is far riskier than the traditional use of biometric technologies to verify or identify a person, the ICO says.
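
To make the distinction concrete: under Art. 9(1) UK GDPR (quoted at the end of this article), biometric data becomes ‘special category data’ when it is processed for the purpose of uniquely identifying a person, and some emotion analysis signals may reveal health data in their own right. The minimal Python sketch below is illustrative only; the signal names, the “health-revealing” set and the is_special_category helper are hypothetical assumptions for the example, not drawn from the ICO or the legislation.

```python
# Illustrative sketch only: a toy decision rule for when processing of
# biometric signals falls under Art. 9(1) UK GDPR. Signal names and the
# "health-revealing" set are assumptions for the example, not legal advice.
from dataclasses import dataclass

@dataclass
class Processing:
    signal: str   # e.g. "facial_image", "heartbeat", "gaze_tracking"
    purpose: str  # e.g. "unique_identification", "emotion_inference"

# Assumption: these signals could reveal data concerning health.
HEALTH_REVEALING = {"heartbeat", "skin_moisture"}

def is_special_category(p: Processing) -> bool:
    """Art. 9(1): biometric data processed for the purpose of uniquely
    identifying a natural person is special category data; data revealing
    health information also qualifies."""
    if p.purpose == "unique_identification":
        return True
    return p.signal in HEALTH_REVEALING

# A verification selfie triggers Art. 9(1):
print(is_special_category(Processing("facial_image", "unique_identification")))  # True
# Gaze tracking for emotion inference does not, by itself, under this toy rule:
print(is_special_category(Processing("gaze_tracking", "emotion_inference")))     # False
```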

Another risk is that if your biometric data is lost or stolen, you cannot easily change your fingerprint, face or retina.

Facial recognition

Biometric technology is advancing rapidly and is likely to become part of everyday life. Financial companies frequently use facial recognition to verify identities against a photo ID or a selfie, and airports aim to streamline passenger journeys with facial recognition at check-in, self-service bag drops and boarding gates. The use of voice recognition is also growing.

The ICO says it is critical that regulators, firms and policymakers understand the potential challenges and opportunities for data protection in order to prepare for a biometric future. Organisations that do not act responsibly, fail to meet ICO expectations, or pose a risk to vulnerable people will be investigated.

“The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work”, said Bonner.

Practice guidance

To shape good practice standards around emotion analysis technologies, the ICO is developing guidance on the wider use of biometric technologies. Two reports, Biometrics: insight and Biometrics: foresight, have been published recently to help businesses navigate the use of emerging biometric technologies.

The new guidance will highlight the importance of data security and will cover key elements of gathering and using biometric data, including:

  • The need to clarify key terminology and definitions surrounding biometric technologies and data.
  • The increased use of biometric technologies for classification and where this sits under existing data protection legislation.
  • The need for compliance with transparency and lawfulness requirements when processing ambient data.
  • The need to understand and appropriately manage high-risk biometric technologies, such as emotion AI.

“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area”, said Bonner.

The ICO’s biometric guidance will be published in Spring 2023 and is being produced in liaison with the Ada Lovelace Institute and the British Youth Council.

Art. 4 (14), UK GDPR

‘Biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data;

Art. 9 (1), UK GDPR

Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.

Source: gdpr-info.eu