PHOENIX — A new federal court ruling, coupled with a provision in the state constitution, could give Arizonans new legal protections against private firms' use of software that captures and stores facial images.
The lawsuit involves claims of invasion of privacy brought against Facebook by some Illinois residents. They claim the company’s practice of scanning uploaded photos to match against those already in its database violates that state’s laws against the collection of anyone’s biometric information by a private company without informing the person and getting a written release.
The 9th U.S. Circuit Court of Appeals earlier this month rejected a bid by Facebook to have the case thrown out.
In a sometimes strongly worded opinion, the judges said there is reason to believe that such practices are an invasion of privacy rights. More to the point, the court concluded that such an invasion can be considered a harm that victims can litigate.
Arizona does not have a similar law.
But Attorney General Mark Brnovich said Arizona does have something else: a specific right to privacy built into the Arizona Constitution. And if that isn’t enough, Brnovich said state lawmakers should take action to enact a specific statute spelling out what private companies can and cannot do with someone’s biometric information, similar to what exists in Illinois.
“I don’t think it’s too much to ask that people respect our privacy,” he said.
House Speaker Russell Bowers, R-Mesa, actually tried to do that earlier this year with legislation to restrict putting biometric information into a database for commercial purposes and generally prohibit that information from being sold, leased or disclosed for commercial purposes without the individual’s consent.
HB 2478 cleared the House Technology Committee without dissent. But a spokesman for Bowers said he yanked the measure from consideration before it got to the House floor “to give stakeholders more time to improve it.”
Brnovich, in an extensive interview with Capitol Media Services, said it’s important to realize what’s at stake.
“We’re talking about facial recognition, voice recognition, the way you walk, your mannerisms, maybe when it starts coming down to issues like DNA and blood information,” he said. “And that’s the kind of stuff that, if it’s compromised or stolen, you can never get back.”
For example, Brnovich said, if credit card information is stolen, the user can cancel the card and get a new one.
“But if someone steals the information on my voice or voice identity, my facial patterns and stuff, that’s something that I can’t change,” he explained. “And that’s something that’s lost forever.”
That’s exactly the logic used by Judge Sandra Ikuta in writing the unanimous opinion for the 9th Circuit — the federal appeals court whose rulings govern nine western states including Arizona — in allowing the lawsuit against Facebook to proceed.
In her example, Ikuta talked about a Social Security number being compromised by hackers.
In that case, she said, someone can get a new number.
“Biometric data are biologically unique to the individual,” Ikuta wrote. “Once compromised, the individual has no recourse, is at a heightened risk for identity theft, and is likely to withdraw from biometric-facilitated transactions.”
Brnovich said the possible harms go far beyond that, saying that once someone has digitized a person’s face, voice and mannerisms, it’s a small step to use artificial intelligence to create an image that mimics someone’s behaviors and patterns.
“There’s something really creepy about that,” he said.
According to court records, the specific issue here involves Facebook’s practice of analyzing uploaded pictures to see if they contain faces.
If so, Ikuta said the technology extracts various geometric data points that make a face unique, like the distance between the eyes, nose and ears to create a face signature or map. Then the technology compares that to other faces in its database of face templates to see if there is a match, at which point Facebook may suggest “tagging” the person in the photo.
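The matching process the opinion describes — reducing a face to geometric measurements, then comparing that signature against stored templates — can be illustrated with a minimal sketch. The function names, the landmark coordinates and the distance tolerance below are all hypothetical stand-ins chosen for illustration, not a description of Facebook’s actual system:

```python
import math

def face_signature(landmarks):
    """Reduce facial landmark coordinates to a list of pairwise
    distances -- a crude stand-in for the 'face signature' the
    opinion describes. `landmarks` maps feature names to (x, y)."""
    names = sorted(landmarks)
    signature = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
            signature.append(math.hypot(x2 - x1, y2 - y1))
    return signature

def match(signature, templates, tolerance=2.0):
    """Compare a signature against a database of stored templates
    and return the name of the closest one within `tolerance`,
    or None if no template is close enough to 'tag'."""
    best_name, best_dist = None, tolerance
    for name, template in templates.items():
        dist = math.dist(signature, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# A database of one stored template, then a probe photo whose
# landmarks are slightly perturbed -- close enough to match.
templates = {
    "alice": face_signature(
        {"left_eye": (0.0, 0.0), "right_eye": (4.0, 0.0), "nose": (2.0, 3.0)}
    )
}
probe = face_signature(
    {"left_eye": (0.1, 0.0), "right_eye": (4.0, 0.1), "nose": (2.0, 3.0)}
)
print(match(probe, templates))  # prints "alice"
```

Real systems use learned embeddings rather than hand-picked landmark distances, but the privacy concern the court identifies is the same either way: once a template exists, any new photo can be reduced to a signature and searched against it.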
Ikuta said that process creates privacy concerns.
“Once a face template of an individual is created, Facebook can use it to identify that individual in any of the other hundreds of millions of photos uploaded to Facebook each day, as well as determine when the individual was present at a specific location,” she wrote. “Facebook can also identify the individual’s Facebook friends or acquaintances who were present in the photo.”
And it’s not just what can happen now, she said, given how technology is developing.
“It seems likely that a face-mapped individual could be identified from a surveillance photo taken on the streets or in an office building,” Ikuta said.
“Or a biometric face template could be used to unlock the face recognition lock on the individual’s cellphone,” she continued. “We conclude that the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.”
And that kind of conduct, Ikuta said, is grounds for litigation.
A spokesman for Facebook told Capitol Media Services the company plans to appeal the 9th Circuit decision allowing the lawsuit to go forward. “We have always disclosed our use of face recognition technology and that people can turn it on or off at any time,” the spokesman said.
Brnovich said, though, that an issue in these kinds of cases is how easy — or hard — it is to opt out.
In fact, he wrote to Facebook last year complaining that it took 21 different clicks and screens for someone to be able to opt out of the company’s data collection policies. The company subsequently agreed to make some changes. That, however, still leaves the question of what rights Arizonans already have to sue over their images being collected, digitized and stored.
It starts with the Arizona Constitution Article 2, Section 8: “No person shall be disturbed in his private affairs, or his home invaded, without authority of law.”
“I have always believed that because we have that right to privacy that provides us more protection than the Fourth Amendment does,” Brnovich said, with the latter covering “unreasonable searches and seizures” and requiring government agents to first obtain a warrant.
Still, he conceded, it remains unsettled exactly how broad that right to privacy is — especially when it is being invaded not by a government agency but by private corporations.
And complicating matters, he said, is that the lines are not clear.
“One of the things that we have recently seen is government working with Big Tech and internet service providers to get information that affects individual rights,” Brnovich said. “So we’re starting to see that line blur a little bit more and more when government is using Big Tech and internet service providers to pretty much do its bidding.”
If nothing else, Brnovich said there needs to be a clear state law about how private companies can use information, particularly if they are making money selling it to others.