Honolulu Star-Advertiser

Consultant fined $6M for AI-generated Biden robocalls

REUTERS/ANDREW KELLY/FILE PHOTO

Signage is seen at the headquarters of the Federal Communications Commission in Washington, D.C., in August 2020. The Federal Communications Commission today finalized a $6 million fine for a political consultant over fake robocalls that mimicked President Joe Biden’s voice, urging New Hampshire voters not to vote in that state’s Democratic primary.

WASHINGTON >> The Federal Communications Commission today finalized a $6 million fine for a political consultant over fake robocalls that mimicked President Joe Biden’s voice, urging New Hampshire voters not to vote in that state’s Democratic primary.

In May, Steven Kramer, a Louisiana Democratic political consultant, was indicted in New Hampshire over calls that appeared to have Biden asking residents not to vote until November. Kramer had worked for Biden’s primary challenger, Representative Dean Phillips, who denounced the calls.

In January, Kramer told media outlets he paid $500 to have the calls sent to voters to draw attention to the dangers of artificial intelligence in campaigns.

The FCC said the calls used a deepfake audio recording, generated with artificial intelligence, meant to sound like Biden’s voice.

FCC rules prohibit the transmission of inaccurate caller ID information. The commission said Kramer will be required to pay the fine within 30 days or the matter will be referred to the Justice Department for collection.

Neither Kramer nor a spokesperson could immediately be reached for comment.

“It is now cheap and easy to use Artificial Intelligence to clone voices and flood us with fake sounds and images,” FCC Chair Jessica Rosenworcel said. “By unlawfully appropriating the likeness of someone we know, this technology can illegally interfere with elections. We need to call it out when we see it and use every tool at our disposal to stop this fraud.”

In August, Lingo Telecom agreed to pay a $1 million fine after the FCC said it transmitted the New Hampshire fake robocalls.

The FCC said that under the settlement, Lingo will implement a compliance plan requiring strict adherence to the commission’s caller ID authentication rules.

The commission in July voted to propose requiring broadcast radio and television political advertisements to disclose whether content is generated by AI. That proposal is still pending.
