PHOENIX — Unable to stop "deep fakes" in politics, state lawmakers are settling on the next best thing: requiring that viewers and listeners be told that what they are seeing is not real.
Legislation awaiting a final Senate vote spells out that anyone who distributes a "synthetic media message" purporting to show a candidate within 90 days of an election must include a "clear and conspicuous disclosure" that the media includes content generated by artificial intelligence.
That message also would need to remain on the screen for the duration of the video.
If it's an audio message, the disclosure would have to be read aloud. And the proposal by Sen. Frank Carroll, R-Sun City West, is designed to ensure this isn't delivered by some fast-talking announcer, with the requirement that it be "in a clearly spoken manner and in a pitch that can be easily heard by the average listener," and that it come at the beginning, at the end and, if the audio is longer, every two minutes throughout.
And the measure carries fines for first-time offenders and prison time for repeat offenders.
What's driving the bill is the fact that it's already happening.
Some of that has been in the criminal sphere, like someone faking the voice of a child to get a parent to pay ransom.
That's being addressed by Sen. John Kavanagh, R-Fountain Hills, with his SB 1078.
It would make it a felony to use a computer-generated voice recording, image or video of another person with intent to defraud or harass others. That measure, too, awaits a final Senate vote.
SB 1359 takes a narrower approach.
"In this case here, we're talking about election integrity," said Carroll. "If someone can be given the appearance as a candidate in office, and they can make statements that are not the statements of the candidate, present themselves otherwise, that is a problem right there."
It already has happened in New Hampshire, where a recorded message sent to voters ahead of the January primary featured someone who sounded like Joe Biden telling listeners not to vote but to "save your vote for the November election."
The regulatory approach appears to be designed to avoid running into First Amendment problems.
In 2012, the U.S. Supreme Court overturned the federal Stolen Valor Act, which made it a crime to make false statements about having received military honors. The court concluded that the fact that a statement was false did not remove its constitutional protections.
What that leaves is disclosure.
"I just have to reiterate and overemphasize that this is about election integrity," Carroll told colleagues. "You can thwart an election by an impostor speaking as if they're the actual candidate."
SB 1359 has a combination of disclosure and criminal penalties.
Creators would be subject to a Class 1 misdemeanor (which can carry up to six months in county jail but generally results only in a fine) for failing to make the proper disclosure. But a second conviction within five years would be a Class 4 felony, which carries a presumptive sentence of 2.5 years in state prison.
That bothered Sen. Priya Sundareshan.
"The bill is needed," the Tucson Democrat said. "This is a scary time for anyone who is a candidate and who could potentially be out there and perceived to be saying things that they didn't say."
But Sundareshan said she cannot support the Class 4 felony.
"Of course, I want there to be compliance with the law," she said. "I want to figure out how do we achieve that."
Sundareshan, however, called the felony "kind of extreme."
Carroll disagreed, saying the penalty for a repeat offender is merited.
"This is serious," he said. "The impact of this, if it is successful, could change the outcome of an election in a most awful way."
Sundareshan had another concern.
As originally crafted, SB 1359's disclosure requirements and penalties applied not just to the person or entity that created the synthetic media but also to any other entity involved in its distribution. Carroll had that language removed when the measure was debated in the Senate.
The question, he said, is having to prove whether someone who posts a deep fake actually bears some liability or simply put it out there unaware of its falsity. Carroll said he wants the focus to be on who created the deep fake and who paid for it.
"And that's what the target should be," he said.
Sundareshan said that makes sense. But she said there may be situations where it's difficult to figure out who the original creator is.
"Maybe it's coming from Russia," Sundareshan said. What the bill does in limiting liability, she said, is allow the deep fake to continue to be used by others "even though the people who are using it might very well know it's a fake."
And what that would do, said Sundareshan, is simply let people continue to use those fakes by saying they weren't the ones who created them.
Carroll said something that's created overseas may be beyond the reach of Arizona law and be an issue of foreign relations.
"And that's a whole area unto itself," he said.
"Your point about foreign actors is well taken," Carroll told Sundareshan. "Hopefully we don't have to go into war over it."
There's actually another measure making its way through the Legislature to deal with deep fakes in political situations, this one without any criminal penalties.
HB 2394 would allow a candidate to go to court and get a quick ruling that what voters are seeing or hearing isn't really the candidate.
The proposal by Rep. Alexander Kolodin, R-Scottsdale, would not allow a judge to actually order a deep fake of a candidate to be removed from wherever it is posted. Nor would there be any sanctions against those who create or post the fakes.
But Kolodin said such a ruling would give the candidate a judicial declaration that the item is a fake.
His measure already has been approved unanimously by the House and awaits Senate action.