AI-Generated Robocalls Impersonating President Biden Surface, Sparking Concerns in New Hampshire

TL;DR:

  • New Hampshire AG investigates AI robocalls mimicking President Biden’s voice.
  • The calls aimed to discourage primary election voting and have been labeled illegal voter suppression.
  • Deceptive messages falsely claim primary voting prevents participation in the general election.
  • The origin of the calls remains unknown; investigations are ongoing.
  • Concerns grow over the use of AI-generated deepfakes in elections globally.
  • Experts emphasize the need for increased regulation and awareness.
  • Political campaigns distance themselves from such tactics, emphasizing the importance of safeguarding democracy.

Main AI News:

The New Hampshire Attorney General’s Office is investigating a disturbing incident involving AI-generated robocalls that imitated the voice of President Joe Biden. The automated calls aimed to discourage voters from participating in the state’s primary election, scheduled for Tuesday. Attorney General John Formella labeled the calls an illegal attempt to disrupt and suppress voting and urged voters to disregard the message entirely.

The recorded message featured a voice remarkably similar to President Biden’s and employed one of his frequently used phrases, “What a bunch of malarkey.” The message then advised listeners to “save your vote for the November election,” falsely implying that voting in the primary would help Republicans re-elect Donald Trump.

It is important to clarify that voting in the primary does not preclude voters from participating in the November general election. While President Biden is not actively campaigning in New Hampshire and is not on the primary ballot, his supporters have initiated a write-in campaign for him.

The origin of these deceptive robocalls remains unknown. They were spoofed to appear to come from the personal cellphone number of Kathy Sullivan, a former state Democratic Party chair involved in supporting the Biden write-in campaign. Sullivan has reported the incident to law enforcement and the Attorney General’s Office, characterizing it as an outright attempt at election interference and harassment.

Although the exact number of recipients of these calls is unclear, at least a dozen individuals have confirmed receiving them. The Attorney General’s Office encourages anyone who has received such a call to contact the state Justice Department’s election law unit.

One recipient, Gail Huntley, a 73-year-old Democrat from Hancock, New Hampshire, recognized the voice as Biden’s but initially believed it to be legitimate. She later realized it was a scam, highlighting the convincing nature of the AI-generated fake. This incident underscores the growing concern surrounding the use of such technology to manipulate voters.

The White House press secretary, Karine Jean-Pierre, and Biden’s campaign manager, Julie Chavez Rodriguez, have confirmed that the call was indeed fake and not recorded by the President. They are actively considering further actions to address this issue, emphasizing the importance of combating disinformation and safeguarding democracy.

This incident involving advanced generative AI technology is a troubling example of the challenges posed by election disinformation in 2024, and experts warn of its potential global impact. Generative AI deepfakes have already appeared in campaign ads during the 2024 presidential race and have been used to spread misinformation in elections worldwide.

Hany Farid, an expert in digital forensics at the University of California, Berkeley, described the call recording as a relatively low-quality AI fake, highlighting the need for increased awareness and regulation. While the federal government grapples with these challenges, there is ongoing debate in Congress and the Federal Election Commission regarding the regulation of AI deepfakes in campaign ads.

David Becker, a former U.S. Department of Justice attorney and election law expert, expressed concerns about the intent behind such calls, suggesting that they aim to undermine trust in the democratic process. Regardless of their specific goals, these incidents underscore the importance of preserving the integrity of elections and combating misinformation.

Katie Dolan, a spokeswoman for Rep. Dean Phillips of Minnesota, emphasized that discouraging voters is unacceptable and an affront to democracy. The potential use of AI to manipulate voters is a deeply troubling development that requires vigilant attention and action.

Conclusion:

The emergence of AI-generated robocalls impersonating political figures, as seen in the case of President Biden, highlights a concerning trend in election disinformation. Such deceptive tactics pose a threat to the integrity of democratic processes. To mitigate this risk, increased regulation and public awareness are essential. Political campaigns must distance themselves from these tactics to ensure the preservation of trust and transparency in elections.
