PHOENIX — Unable to stop “deep fakes” in politics, state lawmakers are settling on the next best thing: requiring that viewers and listeners be told what they are seeing is not real.
Legislation awaiting a final Senate vote spells out that anyone who distributes a “synthetic media message” purporting to show a candidate within 90 days of an election must include a “clear and conspicuous disclosure” that the media includes content generated by artificial intelligence.
That message also would need to remain on the screen for the duration of the video.
If it’s an audio message, there would have to be something read. And the proposal by Sen. Frank Carroll, R-Sun City West, is designed to ensure this isn’t delivered by some fast-talking announcer, with the requirement that it be “in a clearly spoken manner and in a pitch that can be easily heard by the average listener,” and that it come at the beginning, the end and, if the audio is longer, every two minutes throughout.
And the measure carries fines for first-time offenders and prison time for those who repeat.
What’s driving the bill is the fact that it’s already happening.
Some of that has been in the criminal sphere, like someone faking the voice of a child to get a parent to pay ransom.
Thatβs being addressed by Sen. John Kavanagh, R-Fountain Hills, with his SB 1078.
It would make it a felony to use a computer-generated voice recording, image or video of another person with intent to defraud or harass others. That measure, too, awaits a final Senate vote.
SB 1359 takes a narrower approach.
“In this case here, we’re talking about election integrity,” said Carroll. “If someone can be given the appearance as a candidate in office, and they can make statements that are not the statements of the candidate, present themselves otherwise, that is a problem right there.”
It has happened already in New Hampshire, where a recorded message was sent to voters ahead of the January primary with someone who sounds like Joe Biden telling listeners not to vote but to “save your vote for the November election.”
The regulatory approach appears to be designed to avoid running into First Amendment problems.
In 2012 the U.S. Supreme Court overturned the federal Stolen Valor Act, which made it a crime to make false statements about having received military honors. The court concluded that the fact that a statement was false did not remove its constitutional protections.
What that leaves is disclosure.
“I just have to reiterate and overemphasize that this is about election integrity,” Carroll told colleagues. “You can thwart an election by an impostor speaking as if they’re the actual candidate.”
SB 1359 has a combination of disclosure and criminal penalties.
Creators would be subject to a Class 1 misdemeanor, which can carry up to six months in county jail but generally results only in a fine, for failing to make the proper disclosure. But a second conviction within five years would be a Class 4 felony, which carries a presumptive sentence of 2.5 years in state prison.
That bothered Sen. Priya Sundareshan.
“The bill is needed,” the Tucson Democrat said. “This is a scary time for anyone who is a candidate and who could potentially be out there and perceived to be saying things that they didn’t say.”
But Sundareshan said she cannot support the Class 4 felony, calling it “kind of extreme.”
“Of course, I want there to be compliance with the law,” she said. “I want to figure out how do we achieve that.”
Carroll disagreed, saying the penalty for a repeat offender is merited.
“This is serious,” he said. “The impact of this, if it is successful, could change the outcome of an election in a most awful way.”
Sundareshan had another concern.
As originally crafted, SB 1359’s disclosure requirements and penalties applied not just to the person or entity that created a deep fake but also to anyone involved in its distribution. Carroll had that language removed when the measure was debated in the Senate.
The question, he said, is proving whether someone who posts a deep fake actually bears some liability or simply shared it unaware of its falsity. Carroll said he wants the focus to be on who created the deep fake and who paid for it.
“And that’s what the target should be,” he said.
Sundareshan said that makes sense. But she said there may be situations where itβs difficult to figure out who is the original creator.
“Maybe it’s coming from Russia,” Sundareshan said. What the bill does in limiting liability, she said, is allow the deep fake to continue to be used by others “even though the people who are using it might very well know it’s a fake.”
And what that would do, said Sundareshan, is simply let people continue to use those fakes by saying they weren’t the ones who created them.
Carroll said something thatβs created overseas may be beyond the reach of Arizona law and be an issue of foreign relations.
“And that’s a whole area unto itself,” he said.
“Your point about foreign actors is well taken,” Carroll told Sundareshan. “Hopefully we don’t have to go into war over it.”
There’s actually another measure making its way through the Legislature to deal with deep fakes in political situations, this one without any criminal penalties.
HB 2394 would allow a candidate to go to court and get a quick ruling that what people are seeing or hearing isn’t really the candidate.
The proposal by Rep. Alexander Kolodin, R-Scottsdale, would not allow a judge to actually order a deep fake of a candidate to be removed from wherever it is posted. Nor would there be any sanctions against those who create or post the fakes.
But Kolodin said such a court order would give the candidate a judicial declaration that the item is a fake.
His measure already has been approved unanimously by the House and awaits Senate action.