U.S. Senator Falls Victim to ‘Deepfake’ Caller Posing as Ukrainian Official
U.S. Senator Ben Cardin was reportedly targeted by a “deepfake” caller who posed as a top Ukrainian official. According to Punchbowl News, on Monday (Sept. 23) the Senate’s security office informed a select group of leadership aides and security chiefs from various Senate committees about an incident that occurred earlier in the month on the video conferencing platform Zoom.
A notice revealed that an individual had been impersonating Dmytro Kuleba, who recently stepped down as Ukraine’s foreign minister. During the call, the impersonator put questions to the Senate Foreign Relations Committee chair that participants found unusual, including: “Do you support long-range missiles into Russian territory? I need to know your answer,” as stated in the notice sent to senior Senate aides. The impersonator also asked “politically charged questions in relation to the upcoming election.”
The Democratic senator from Maryland issued a statement on Wednesday (Sept. 25) following the incident, saying: “In recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual. After immediately becoming clear that the individual I was engaging with was not who they claimed to be, I ended the call and my office took swift action, alerting the relevant authorities.”
He added: “This matter is now in the hands of law enforcement, and a comprehensive investigation is underway.”
The FBI is investigating the matter, according to sources briefed on the investigation.
U.S. faces growing threat of foreign actors using deepfakes
It is not the first time senators and politicians have been targeted by deepfakes and cyberattacks during the U.S. presidential election campaign. ReadWrite reported in August that Iran was accused of orchestrating cyberattacks on Donald Trump and Kamala Harris’ campaigns. Hackers from APT42, a group linked to Iran’s Islamic Revolutionary Guard Corps (IRGC), targeted high-profile U.S. political figures. However, Iran has denied the allegations, calling them “unsubstantiated” and lacking evidence.
Meanwhile, deepfakes and other forms of AI-generated content are reportedly being deployed by Beijing to meddle in the affairs of the United States and Taiwan. Researchers have found specific examples of manipulated imagery being pushed to fuel conspiracy theories that the U.S. government intentionally caused a train derailment in Kentucky and the wildfires in Maui, Hawaii, in 2023.
Deepfake video technology uses artificial intelligence to create synthetic video and audio of people that look and sound real. The technology has sometimes been used to impersonate public figures, including robocalls to New Hampshire voters in January that mimicked President Joe Biden’s voice.
Featured image: Senate Democrats / Ideogram