Report Cites Vulnerabilities in Voice Authentication (Sept. 29, 2015)
Biometric authentication is widely viewed as one of the most promising methods of enhancing payments security—but it isn’t foolproof, a new study suggests. Researchers at the University of Alabama at Birmingham (UAB) found that voice-based user authentication systems are vulnerable to “voice impersonation attacks,” in which criminals obtain a sample of a user’s voice and use voice-morphing software to build a model of the user’s speech pattern. From there, fraudsters can use the model to say virtually anything in the victim’s voice, enabling them to penetrate voice verification systems, the study found. Voice authentication algorithms caught just 10 to 20 percent of the test attacks the researchers launched using an off-the-shelf voice-morphing tool.
Much of the threat of voice imitation lies in just how easy it is to obtain a voice sample, the researchers noted. “Because people rely on the use of their voices all the time, it becomes a comfortable practice,” said Nitesh Saxena, associate professor of computer and information sciences at UAB. “What they may not realize is that level of comfort lends itself to making the voice a vulnerable commodity. People often leave traces of their voices in many different scenarios. They may talk out loud while socializing in restaurants, giving public presentations or making phone calls, or leave voice samples online.” Given this ubiquity, criminals can easily snatch a voice sample by recording a person in public, making a spam phone call or even searching for videos the target has posted online.
To defend against voice imitation attacks, the researchers recommend developing systems that can detect the live presence of the speaker—not just the pattern of their voice. The team plans to conduct future research into that and other defense strategies.
In a Paybefore blog post, however, Enacomm CEO Michael Boukadakis contends that voice biometric authentication is even more reliable than fingerprinting, with a 99.99 percent success rate. “Even voice recordings or ‘replays’ cannot be used to gain unauthorized account access,” he writes. “Voice authentication applications can ask for words, numbers and phrases in random order, making recordings or stolen voiceprints useless.”
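The random-order challenge Boukadakis describes can be illustrated with a minimal sketch. The word list, function names, and matching logic below are illustrative assumptions, not any vendor’s actual implementation; a real system would pair this prompt check with a voiceprint match and liveness detection, and the transcript would come from a speech recognizer.

```python
import secrets

# Hypothetical prompt vocabulary; a real system would use a larger,
# phonetically varied set.
WORDS = ["apple", "river", "maple", "stone", "cloud", "ember"]

def make_challenge(n_words=3, n_digits=2):
    """Build a one-time challenge of random words and digits.

    Because the sequence changes on every attempt, a prerecorded
    passphrase (a 'replay') will not match the prompt."""
    words = [secrets.choice(WORDS) for _ in range(n_words)]
    digits = [str(secrets.randbelow(10)) for _ in range(n_digits)]
    return " ".join(words + digits)

def response_matches(challenge, transcript):
    """Compare the transcribed spoken response to the prompt,
    ignoring case and extra whitespace."""
    return challenge.lower().split() == transcript.lower().split()

challenge = make_challenge()
print(challenge)  # e.g. "maple stone cloud 4 7" (random each run)
print(response_matches(challenge, challenge))              # True
print(response_matches(challenge, "my fixed passphrase"))  # False
```

Note that this check only defeats static replays; the UAB study’s morphing attack, which can synthesize arbitrary phrases on demand, would still require the liveness defenses the researchers propose.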
Related stories: