
Addressing Legal, Ethical, and Technical Issues Associated with Voice Biometrics: Choosing the Privacy-First Approach

In the previous article, we answered the questions about voice biometrics that typically arise before and during the implementation of such systems. There are, however, some more strategic issues at CEO and CIO level that are indirectly linked to brand and reputation management and deserve serious attention. Let’s crack on with the next three:

1. Does the shift to voice biometrics create threats to commonly accepted notions of privacy and security?

The short answer is no. Voice biometrics helps ensure cost-effective and secure remote access to services, preventing unauthorised access to personal data, financial resources, information, etc. Voiceprints can be created from audio recordings of customers’ conversations and, generally speaking, the same privacy and security rules apply to storing and using both voiceprints and audio recordings. Recording customer conversations is allowed, and it is recognised as good practice to inform customers about it; in some jurisdictions, it is mandatory to state explicitly that conversations are being recorded.

The concepts of data privacy and security, and the way they are reflected in public consciousness, remain couched in rather technical and bureaucratic language, but one thing is clear to most people: the use of biometrics makes it harder for criminals of all kinds to succeed in their fraudulent activities.

For others, the main concern is that the use of biometrics will be imposed on them by giant companies using security concerns as a pretext. From the legal point of view, customers’ consent is required at all times for the lawful processing of personal data under the GDPR, both in the EU and for companies that work with EU-based customers. Although the voiceprint itself is a mathematical model of a person’s voice and does not contain any personal data falling under the GDPR definition, voiceprints are normally linked to the personal data that organisations collect and process (names, online or other identifiers, etc.) and, in such cases, should be treated in accordance with the GDPR. In countries outside the EU, similar regulations either exist already or are bound to appear soon, driven by the sheer logic of economic interconnectedness.
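To make that distinction concrete, here is a minimal sketch (in Python) of how a voiceprint might be held as a pure mathematical model under a random pseudonymous identifier, with the link to a customer’s personal data kept in a separate table. The names, the embedding stub and the storage layout are purely illustrative assumptions, not a description of Spitch’s actual implementation.

```python
from dataclasses import dataclass
from typing import Dict, List
import hashlib
import uuid


def extract_embedding(audio: bytes) -> List[float]:
    """Placeholder: a real system would run a speaker-embedding model here."""
    digest = hashlib.sha256(audio).digest()
    return [b / 255.0 for b in digest[:16]]


@dataclass
class Voiceprint:
    """Only a mathematical model of the voice: a vector, not a recording."""
    pseudonymous_id: str       # random token that reveals nothing by itself
    embedding: List[float]     # speaker model derived from enrolment audio


# Personal data (names, phone numbers, ...) lives in a separate store; it is
# the link table below that ties a voiceprint to a person, and that link is
# what brings the voiceprint within the scope of the GDPR.
personal_data_store: Dict[str, dict] = {}   # customer_id -> personal details
link_table: Dict[str, str] = {}             # customer_id -> pseudonymous_id
voiceprint_store: Dict[str, Voiceprint] = {}


def enrol(customer_id: str, enrolment_audio: bytes) -> Voiceprint:
    """Create a voiceprint and record only a pseudonymous link to the customer."""
    pseudo_id = uuid.uuid4().hex
    voiceprint = Voiceprint(pseudo_id, extract_embedding(enrolment_audio))
    link_table[customer_id] = pseudo_id
    voiceprint_store[pseudo_id] = voiceprint
    return voiceprint
```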

To summarise: it is not about voice biometrics as such, but rather about the intentions and ethics of largely unaccountable corporations, which may not necessarily fit with people’s expectations.

2. Are there limitations on transferring voiceprint data across borders, or on storing biometric samples outside the country?

When a company decides to use voice biometrics and create customers’ voiceprints, it is necessary to inform customers in a transparent way and to receive their free, prior and informed consent.

That consent can be given by the customer in any legally appropriate form, e.g. by signing the statement of consent in the Terms and Conditions, by clicking the “accept” button under the same form on the website, or by confirming by voice during a recorded telephone conversation with a call centre agent.

Customers must also be offered the possibility to opt out if they decide not to use biometrics, or to have any previously created voiceprint deleted, and it must be possible to use all services without it. Customers should also be made aware of the purpose of data collection and processing.
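As an illustration only, the following sketch shows how a consent record covering the channels above, the stated purpose and an opt-out could be represented; the class and field names are assumptions made for the example, not part of any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class ConsentChannel(Enum):
    TERMS_AND_CONDITIONS = "signed statement of consent in the T&C"
    WEBSITE_CLICK = "accept button under the consent form on the website"
    RECORDED_CALL = "voice confirmation during a recorded call"


@dataclass
class BiometricConsent:
    customer_id: str
    channel: ConsentChannel
    purpose: str                              # why the data is collected and processed
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None   # set when the customer opts out

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def opt_out(self) -> None:
        """Opting out must always remain possible; services keep working without biometrics."""
        self.withdrawn_at = datetime.now(timezone.utc)
```

A record like this would typically be created at enrolment and checked before every verification attempt, so that a withdrawn consent immediately stops biometric processing.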

Centralised storage of voiceprints and other biometric samples should be justified and securely protected, with controlled and regulated access. In the case of cross-border data transfers, it is necessary to ensure that the privacy of the data is not endangered. It is important to note that transfers of biometric data may involve risks to the protection of rights and freedoms. Most jurisdictions in the EU provide adequate protection; however, some EU countries do not guarantee protection for the personal data of legal entities. Outside the EU, many jurisdictions do not provide a sufficient level of protection.
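As one possible reading of “securely protected with controlled and regulated access”, the sketch below encrypts voiceprints at rest and releases them only to an authorised system role. It uses the third-party cryptography package, and the key handling and role model are deliberately simplified assumptions for illustration.

```python
import json
from typing import Dict, List

from cryptography.fernet import Fernet   # third-party: pip install cryptography

# Illustrative only: in production the key would live in an HSM or a managed
# key service, never in application code, and every access would be audited.
_key = Fernet.generate_key()
_vault = Fernet(_key)
_encrypted_store: Dict[str, bytes] = {}   # pseudonymous_id -> ciphertext

AUTHORISED_ROLES = {"biometric-service"}  # controlled, regulated access


def store_voiceprint(pseudo_id: str, embedding: List[float]) -> None:
    """Encrypt the voiceprint at rest before it ever reaches central storage."""
    _encrypted_store[pseudo_id] = _vault.encrypt(json.dumps(embedding).encode())


def load_voiceprint(pseudo_id: str, caller_role: str) -> List[float]:
    """Release a voiceprint only to explicitly authorised system roles."""
    if caller_role not in AUTHORISED_ROLES:
        raise PermissionError("access to biometric samples is restricted")
    return json.loads(_vault.decrypt(_encrypted_store[pseudo_id]))
```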

Data transfer agreements in Switzerland are derived from the EU model for personal data transfers. It is necessary to refer to the list of countries providing a sufficient level of protection, and to seek legal advice where required. It is also essential to ensure compliance with the EU GDPR when voiceprints linked to personal data are being transferred.

The safest option, of course, is to keep customers’ personal data in-country, either with a certified service provider or on-premises, as Spitch normally advises its clients.

3. Can a customer/employee request that their voice data should not be used for authentication? Can they request the deletion of their data?

Yes, they have the right to do so, and it must be respected. As mentioned above, customers should have the possibility to opt out if they do not want to be authenticated by voice. Deletion is also allowed, and in some cases prescribed: when the collection and processing of the personal data has been unlawful, or when individuals exercise their right, in certain circumstances, to have inaccurate personal data rectified, blocked or deleted (e.g. because they are no longer customers of the company where their personal data have been kept, as per GDPR). Customer consent is also required when dealing with the personal data of EU citizens if voice data is transferred to a third party.
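Tying the illustrative sketches above together, an erasure or opt-out request might be honoured along the following lines; again, this is an assumption-laden example reusing the hypothetical stores defined earlier, not a description of any specific system.

```python
def handle_erasure_request(customer_id: str) -> bool:
    """Honour an opt-out or GDPR erasure request for a customer's voiceprint.

    Reuses the illustrative link_table, voiceprint_store and _encrypted_store
    from the sketches above. A real system would also verify the requester's
    identity, log the request, and propagate the deletion to backups within
    the statutory deadlines.
    """
    pseudo_id = link_table.pop(customer_id, None)
    if pseudo_id is None:
        return False                          # nothing enrolled for this customer
    _encrypted_store.pop(pseudo_id, None)     # remove the encrypted biometric sample
    voiceprint_store.pop(pseudo_id, None)     # and any unencrypted working copy
    # The customer keeps full access to every service without voice biometrics.
    return True
```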
