Artificial intelligence-enabled voice cloning tools have made it easier for criminals to mimic strangers’ voices and dupe victims into handing over large sums of money.
For example, a scammer might target a victim by posing as their grandchild and claiming they need cash, fast. Older people who may be less familiar with new technologies such as AI can be especially susceptible to these scams, particularly when the voice on the other end of the line sounds identical to a loved one's. Scammers can also spoof phone numbers so that a call appears to come from someone the target knows.
In 2023, senior citizens were conned out of roughly $3.4 billion across a range of financial crimes, according to FBI data. The agency recently warned that AI has increased the “believability” of criminal scams, given that such tools “assist with content …