ICO: Emotional biometrics "risky", may never mature

The Information Commissioner’s Office (ICO) has warned companies against using biometric technologies for emotion analysis, cautioning that these still-immature systems carry a risk of systemic bias, inaccuracy and discrimination.

Biometric systems used for emotion analysis rely on processing data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats and skin moisture, collected in a variety of ways – the aim being to judge someone’s emotional state so a company can tailor the service it offers.

This means a large amount of data is collected and analysed, which the ICO notes is ‘far more risky’ than traditional uses of biometrics to verify or identify a person.

“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” said deputy commissioner Stephen Bonner.

“While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”

Bonner added, “As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”

The ICO is set to further scrutinise the biometrics market in 2023. The organisation is developing guidance on biometrics, which it plans to release in spring 2023 to aid companies looking to adopt technologies such as facial, fingerprint and voice recognition.

In the meantime, the ICO has released two new reports this week, which aim to support businesses in embedding a privacy-by-design approach when adopting biometrics.

Protecting biometric data is particularly important because of its unchanging nature. As the ICO puts it: ‘Biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used.’

ICO remains cautious on biometrics

The latest note of warning is a continuation of the ICO’s existing stance on biometrics. The Office has long held concerns about the technology, especially the use of facial recognition. Last year, then-Commissioner Elizabeth Denham said she was “deeply concerned” about the use of live facial recognition, and that such mass data collection could have “significant” consequences.

The ICO has also spoken out against facial recognition use by the police, in schools and in supermarkets, and this year fined Clearview AI £7.5 million for collecting and using UK residents’ image data.
