Imposter scams have been around for years, such as those involving callers who claim a grandchild has been in an accident or been robbed and needs money. In those cases, scammers pose as the grandchild or play generic recordings of someone screaming in the background. Those attempts to extort money weren't always successful, but federal officials are now warning about a new virtual kidnapping fraud that uses artificial intelligence (AI) to clone a loved one's voice. AI voice-cloning programs are inexpensive and easily accessible, and they can create convincing likenesses of a voice from just a few seconds of dialogue taken from social media posts. The FBI reports that most scam calls involving AI originate in Mexico and target Latino communities in the southwestern U.S. These sophisticated ruses can be effective, with fake kidnappers stealing an average of $11,000 from each victim. To avoid being scammed, families are advised not to mention upcoming trips on social media or give financial information to strangers over the phone. They should also create a family password or phrase that can help verify whether a reported kidnapping is real.