May 15, 2025 - Politics & Policy
Scams use AI to mimic senior officials' voices, FBI warns

The J. Edgar Hoover Building, the FBI headquarters, in Washington. Photo: Kent Nishimura/Getty Images
Scammers are using artificial intelligence to impersonate senior U.S. officials, the FBI warned Thursday.
Why it matters: The impersonations show how sophisticated scammers have become at using artificial intelligence to exploit their targets.
- Many of the targets have been current or former government officials and their contacts, the FBI said.
- "The malicious actors have sent text messages and AI-generated voice messages — techniques known as smishing and vishing, respectively — that claim to come from a senior U.S. official in an effort to establish rapport before gaining access to personal accounts," the FBI alert said.
- The FBI did not immediately respond to Axios' request for more information.
Context: With just a few seconds of audio, artificial intelligence can clone a voice so closely that the result is virtually indistinguishable from the original to the human ear.
- Scammers have weaponized voice cloning tech, and many products lack significant safeguards to prevent fraud or misuse.
Our thought bubble, from Axios' Ina Fried: It's another sign that voice cloning has become trivially easy, and that the era of deepfakes is here, not in the future.
- Security systems should not rely on voice as a means of authentication, and everyone, whether a government official or a grandmother, should not assume the person on the other end of a phone call is who they claim to be.
State of play: Federal layoffs have created new openings for cybercriminals and nation-state adversaries to find targets.
- Russia and China have attempted to recruit disgruntled federal employees, CNN reported in March.
Go deeper: AI voice-cloning scams: A persistent threat with limited guardrails