
Millions could fall prey to AI voice-cloning scams, British bank warns

Scammers can use artificial intelligence to replicate people's voices and deceive their loved ones, a British bank has warned.

Fraudsters can use AI to mimic a person's voice, producing convincing imitations from as little as three seconds of audio taken from sources such as videos posted online, according to Starling Bank.


Digital banking service provider Starling Bank warns about the rising threat of AI-powered voice cloning scams. Fraudsters can create convincing voice replicas using just three seconds of audio, often obtained from public videos. These AI-generated voices can then be used to impersonate the person and contact their friends or family, requesting money under the pretext of an emergency.

Starling Bank estimates that these scams could ensnare millions of unsuspecting people. In a survey of more than 3,000 adults conducted with Mortar Research, more than a quarter of respondents said they had been targeted by an AI voice-cloning scam in the past year.

The survey also found that 46% of respondents were unaware that such scams exist, and that 8% would send the full amount requested by a supposed friend or family member, even if the call seemed suspicious.

Lisa Grahame, Starling Bank's chief information security officer, highlighted the issue in a press release, stating, "People often share content online with voice recordings, unknowingly making themselves more vulnerable to fraudsters."

Starling Bank advises customers to establish a "secure phrase" with loved ones: a simple, memorable phrase, distinct from any other password, that can be used to verify a caller's identity over the phone. The phrase should not be shared by text message, where scammers could more easily intercept it; if it is sent by text, the message should be deleted as soon as the recipient has seen it.

As AI voice mimicry becomes more sophisticated, concerns are growing about the potential harm it could cause, such as aiding criminals in accessing bank accounts and spreading misinformation.

Earlier this year, OpenAI, the creator of the popular generative AI chatbot ChatGPT, unveiled a voice-replication tool, Voice Engine, but chose not to make it available to the public at the time, citing the "risk of synthetic voice misuse."

