
Parents warned of disturbing kidnapping scheme using kids’ voice replicas [Video]


Phone scams have been around for a while, but recent advancements in artificial intelligence technology are making it easier for bad actors to convince people they have kidnapped their loved ones.

Scammers are using AI to replicate the voices of people’s family members in fake kidnapping schemes designed to convince victims to send money in exchange for their loved ones’ safety.

The scheme recently targeted two victims in a Washington state school district.

Highline Public Schools in Burien, Washington, issued a Sept. 25 notice alerting community members that the two individuals were targeted by “scammers falsely claiming they kidnapped a family member.”

CAN AI HELP SOMEONE STAGE A FAKE KIDNAPPING SCAM AGAINST YOU OR YOUR FAMILY?
