AI voice-generating software is being used by scammers to mimic loved ones and trick vulnerable people out of thousands of dollars, according to The Washington Post. Some tools require just a few sentences of audio to convincingly reproduce the sound and emotional tone of a speaker’s voice. Impostor scams are extremely common in the US: more than 5,000 victims lost a combined $11m to phone scams in 2022, according to the Federal Trade Commission. The difficulty of tracing calls and identifying perpetrators, along with a lack of jurisdictional clarity, makes it hard for authorities to crack down.