
Scammers use AI voices to impersonate loved ones to steal millions

王林
Release: 2023-05-02 22:10:05


In 2022, more than 5,000 victims were scammed out of money over the phone.

  • Artificial intelligence voice generation software allows scammers to imitate the voices of loved ones.
  • These impersonations led to people being scammed out of $11 million over the phone in 2022.
  • Elderly people make up the majority of the target population.

Artificial intelligence has been a central topic in the tech world for some time now, as Microsoft continues to inject ChatGPT into its products and Google attempts to keep up with the trend by launching its own artificial intelligence products. While AI has the potential to do some truly impressive things — like generating an image from a line of text — we’re starting to see more downsides to a technology that’s largely unregulated. The latest example is artificial intelligence voice generators being used to defraud people out of their money.

Artificial intelligence speech generation software has been in the headlines lately, mostly for stealing the voices of voice actors. Early versions of the software needed a longer sample of speech to convincingly recreate a speaker's voice and tone; the technology has since advanced to the point where just a few seconds of conversation are enough to imitate someone accurately.


According to a new report from The Washington Post, thousands of victims say they were deceived by imposters posing as loved ones. Imposter scams have reportedly become the second most common type of fraud in the United States, with more than 36,000 cases filed in 2022. According to FTC officials, of those 36,000 cases, more than 5,000 victims were scammed out of money over the phone, with losses totaling $11 million.

One story that stands out involves an elderly couple who sent more than $15,000 to scammers through a Bitcoin terminal after believing they had spoken to their son. An artificial intelligence voice had led the couple to believe their son was in legal trouble for killing an American diplomat in a car accident.

As with the couple in that story, these attacks appear to mostly target the elderly. This is not surprising, as seniors are among the groups most vulnerable to financial fraud. Unfortunately, courts have yet to decide whether companies should be held liable for damage caused by AI voice generators or other forms of AI technology.

Source: 51cto.com