CyberArk warns of high-quality deepfake technology enhancing AI voice cloning scams

As AI voice cloning scams rise, experts warn how easy it has become to create a convincing deepfake: just three seconds of audio is enough. The trend raises urgent concerns about digital identity protection and consumer trust.

Sources: The Indian Express, Solutions Review
CyberArk warns that the rapid advancement of deepfake technology is significantly enhancing AI voice cloning scams, allowing fraudsters to manipulate victims emotionally and convincingly impersonate individuals.

According to David Higgins, senior director at CyberArk, “Deepfake technology has advanced at such a striking pace in recent years, largely due to the breakthroughs in generative AI and machine learning.”

The technology requires only three seconds of audio to clone a voice, making it alarmingly accessible. A report from AIPRM indicates that 70% of adults lack confidence in distinguishing between real and cloned voices, highlighting the growing threat.

Higgins emphasizes that “the most immediate threat to society is that high-quality deepfake technology is widely available, enabling fraudsters and organized crime groups to enhance their scam tactics.”

As businesses grapple with these challenges, findings from Jumio underscore the urgency of stronger digital identity protection: trust in digital life is eroding under the pressure of deepfakes and cybercrime.

The implications of these scams extend beyond individual victims, threatening the very concept of truth in society.

“Dismissing deepfakes as exaggerated or irrelevant underestimates one of the most disruptive threats faced today,” Higgins warns, urging vigilance against this evolving menace.
“The most immediate threat to society is that high-quality deepfake technology is widely available, enabling fraudsters and organized crime groups to enhance their scam tactics.”
— David Higgins, Senior Director at CyberArk
Key Facts
  • AI voice cloning scams use emotional manipulation to trick people into sending money. (The Indian Express)
  • Scammers need just three seconds of audio to clone a person’s voice and use it in a scam call. (The Indian Express)
  • According to AIPRM, AI voice cloning was among the fastest-growing scams of 2024, with 70% of adults unsure whether they could identify a cloned voice. (The Indian Express)
  • David Higgins warns that high-quality deepfake technology is widely available and is enhancing scam tactics.
Key Stats at a Glance
  • Adults unsure they could identify a cloned voice: 70% (The Indian Express)
  • Audio needed to clone a voice: 3 seconds (The Indian Express)