5.1 Million UK Taxpayers' Voice Profiles Collected

Privacy campaign group Big Brother Watch has recently brought to light that HMRC has collected voice profile data from over 5.1 million taxpayers without the consent of the UK public.

The Information Commissioner’s Office (ICO) is currently investigating complaints about the service. Responses received so far state that the data has been held securely, and that callers could opt to avoid using Voice ID.

How does Voice ID work?

Voice ID asks callers to repeat the phrase “my voice is my password”. Once enrolled, the caller can use that phrase to authenticate on future calls to the service.
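To illustrate the general idea only (this is not HMRC’s actual implementation), a voiceprint system typically stores a numeric embedding of the enrolment recording and later compares a fresh recording of the passphrase against it. The Python sketch below assumes pre-computed embeddings and an arbitrary similarity threshold:

import numpy as np

MATCH_THRESHOLD = 0.85  # hypothetical threshold; real systems tune this carefully

def enrol(profiles: dict, caller_id: str, voiceprint: np.ndarray) -> None:
    # Store a normalised reference voiceprint captured during enrolment.
    profiles[caller_id] = voiceprint / np.linalg.norm(voiceprint)

def verify(profiles: dict, caller_id: str, sample: np.ndarray) -> bool:
    # Compare a fresh utterance of the passphrase against the stored print.
    reference = profiles.get(caller_id)
    if reference is None:
        return False
    sample = sample / np.linalg.norm(sample)
    return float(np.dot(reference, sample)) >= MATCH_THRESHOLD  # cosine similarity

profiles: dict = {}
enrolment_embedding = np.random.rand(192)           # stands in for a model-derived embedding
enrol(profiles, "caller-123", enrolment_embedding)
later_call = enrolment_embedding + np.random.normal(0, 0.01, 192)  # same voice, new recording
print(verify(profiles, "caller-123", later_call))   # True when similarity clears the threshold

In practice the embedding would come from a trained speaker-recognition model rather than random numbers, but the enrol-then-verify flow is the same.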

What is the purpose of the service?

HMRC say the aim is to speed up security checks when a caller uses the voice service.

However, Big Brother Watch claimed that taxpayers were being “railroaded into a mass ID scheme”, as they were not given the choice to opt out.

“These voice IDs could allow ordinary citizens to be identified by government agencies across other areas of their private lives,” said Silkie Carlo, director of Big Brother Watch.

She went on to demand that HMRC “delete the five million voiceprints they’ve taken in this shady scheme”.

HMRC responded by saying that the technology was very popular with customers.

Are HMRC breaking any regulations?

The General Data Protection Regulation (GDPR) came into effect across the European Union last month. Under GDPR, organisations must obtain explicit consent before using data to identify someone, and this includes voice recordings.

It would seem that HMRC have not obtained explicit consent; responding to a freedom of information request, the department stated:

HMRC currently operates VoiceID on the basis of the implied consent of the customer, but is developing a new process which will be operated on the basis of the explicit consent of the customer.
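As a rough sketch of what moving to explicit consent could look like in code (the record structure and names below are invented for illustration, not HMRC’s design), enrolment would simply be blocked until the caller has actively agreed:

from dataclasses import dataclass

@dataclass
class ConsentRecord:
    caller_id: str
    explicit_consent: bool  # set only after the caller actively agrees, never implied from use

def may_enrol_voiceprint(record: ConsentRecord) -> bool:
    # Enrolment proceeds only when explicit consent is on file.
    return record.explicit_consent

print(may_enrol_voiceprint(ConsentRecord("caller-123", explicit_consent=False)))  # False: implied consent is not enough
print(may_enrol_voiceprint(ConsentRecord("caller-123", explicit_consent=True)))   # True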

Are there security risks with HMRC using this technology?

LHN was provided with an exclusive statement from Tom Harwood, CPO and Co-Founder at Aeriandi:

“Biometrics technology has been shown to significantly reduce fraud, especially in the financial sector – but it’s not the whole solution. Last year, twins demonstrated how easy it is to trick these systems after they gained access to HSBC’s voice biometrics security platform.

“No security technology is 100% fool-proof, and it is now possible to cheat voice recognition systems. Voice synthesiser technology is a great example. It makes it possible to take an audio recording and alter it to include words and phrases the original speaker never spoke, thus making voice biometric authentication insecure.

“Organisations need additional technologies – beyond biometrics – to protect their customers. Fraud detection technology is the prime candidate. It looks at far more than the voiceprint of the user; it considers hundreds of other parameters to ensure the caller and the call are legitimate – everything from their location to the acoustic dimensions of the room they’re making the call from.”
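As a rough illustration of the kind of multi-signal check Harwood describes (the signal names and weights here are invented for illustration, not Aeriandi’s product logic), a fraud score might combine the biometric result with other call parameters:

def fraud_risk_score(signals: dict) -> float:
    # Weighted sum of flagged risk signals; 0 is low risk, values near 1 are high risk.
    weights = {
        "voiceprint_mismatch": 0.5,      # biometric check failed or was marginal
        "unexpected_location": 0.3,      # call origin differs from the caller's usual pattern
        "synthetic_audio_markers": 0.2,  # acoustics suggest recorded or generated speech
    }
    return sum(weights[name] for name, flagged in signals.items() if flagged and name in weights)

call_signals = {
    "voiceprint_mismatch": False,
    "unexpected_location": True,
    "synthetic_audio_markers": True,
}
print(fraud_risk_score(call_signals))  # 0.5, enough to trigger additional checks

The point of such a score is that no single factor, including the voiceprint itself, is trusted on its own.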
