There seemed to be nothing wrong with Rep. Zack Stephenson’s testimony before a House committee on why the Minnesota Legislature should crack down on the use of so-called deepfake technology.
Using artificial intelligence, the technology can manipulate audio and video to create lifelike recreations of people saying or doing things that never actually happened, he said. Deepfakes have been used to create sexually explicit videos of people or fake political content aimed at influencing elections.
Then the Coon Rapids Democrat paused for a revelation. His comments up to that point had been written by the artificial intelligence software ChatGPT from a one-sentence prompt. He wanted to demonstrate “the advanced nature of artificial intelligence technology.”
It worked.
“Thank you for that disturbing statement,” replied Rep. Mike Freiberg, DFL-Golden Valley, chair of the House elections committee.
The proposal represents a first attempt by the Minnesota Legislature to curb the spread of disinformation through technology, particularly when it is used to interfere with elections or to spread fake sexual images of someone without that person’s consent.
In Minnesota, it is already a crime to post, sell, or distribute private explicit images and videos without the person’s permission. But that revenge porn law was written before much was known about deepfake technology, which is already being used in Minnesota to disseminate realistic, but not real, sexual images of people.
Stephenson’s bill would make it a gross misdemeanor to knowingly distribute sexually explicit deepfake content that depicts an identifiable individual without that person’s permission.
The proposal aligns with laws California and Texas have already adopted to curb the use of the technology in those situations, as well as when deepfakes are used to influence an election. His bill would penalize anyone who uses the technology within 60 days of an election to try to hurt a candidate’s credibility or otherwise influence voters.
Deepfake video has yet to become a major problem in federal or state elections, but national groups have sounded the alarm that the technology has evolved rapidly in recent years.
Republicans on the committee supported the idea, concerned about the possibility of the technology creating fake information that could quickly spread on social media.
“I had seen a video, artificial intelligence, where they literally heard someone’s voice for a short time, 20 or 30 seconds, and then they were able to duplicate that voice,” said Rep. Pam Altendorf, R-Red Wing. “At the same time it was totally fascinating to me, it was also totally terrifying to think about what people could do with it.”
The House Elections Finance and Policy Committee voted to advance the proposal last week.
“Hopefully we can resolve this before Skynet becomes self-aware,” Freiberg said after the vote.