
State Attorney General Mark Brnovich wants the Legislature to enact tough safeguards against facial recognition software abuse.

A new federal court ruling — coupled with a provision in the state constitution — could give Arizonans new legal protections against the use of software by private firms that captures and stores facial images.

The ruling came in a lawsuit that some Illinois residents filed against Facebook for invasion of privacy.

They claim the company’s practice of scanning uploaded photos to match against those already in its database violates that state’s laws against the collection of anyone’s biometric information by a private company without written consent.

The 9th Circuit Court of Appeals earlier this month rejected a bid by Facebook to have the case thrown out.

In a sometimes strongly worded opinion, the judges said there is reason to believe that such practices are an invasion of privacy rights and that such an invasion can be considered a harm victims can litigate.

Arizona does not have a similar law.

But Attorney General Mark Brnovich pointed out that Arizona has a specific right to privacy built into the state constitution. 

And if that isn’t enough, Brnovich said state lawmakers should take action to enact a specific statute spelling out what private companies can — and cannot — do with someone’s biometric information, similar to what exists in Illinois.

“I don’t think it’s too much to ask that people respect our privacy,’’ he said.

House Speaker Rusty Bowers, R-Mesa, actually tried to do that earlier this year with legislation to restrict putting biometric information into a database for commercial purposes and generally prohibiting that information from being sold, leased or disclosed for commercial purposes without the individual’s consent.

HB 2478 cleared the House Technology Committee without dissent. 

But a spokesman for Bowers said he yanked the measure from consideration before it got to the House floor “to give stakeholders more time to improve it.’’

Brnovich told Capitol Media Services much is at stake.

“We’re talking about facial recognition, voice recognition, the way you walk, your mannerisms, maybe when it starts coming down to issues like DNA and blood information,’’ he said. “And that’s the kind of stuff that, if it’s compromised or stolen, you can never get back.’’

For example, Brnovich said, if credit card information is stolen, the user can cancel the card and get a new one.

“But if someone steals the information on my voice or voice identity, my facial patterns and stuff, that’s something that I can’t change,’’ he explained. “And that’s something that’s lost forever.’’

That’s exactly the logic used by Judge Sandra Ikuta in writing the unanimous opinion for the 9th Circuit in allowing the lawsuit against Facebook to proceed.

“Biometric data are biologically unique to the individual,’’ Ikuta wrote. “Once compromised, the individual has no recourse, is at a heightened risk for identity theft and is likely to withdraw from biometric-facilitated transactions.’’

Brnovich said the possible harms go far beyond that.

He said that once someone has digitized a person’s face, voice and mannerisms, it’s a small step to use artificial intelligence to create an image that mimics someone’s behaviors and patterns.

“There’s something really creepy about that,’’ he said.

According to court records, the specific issue here involves Facebook’s practice of analyzing uploaded pictures to see if they contain faces.

If so, Ikuta said the technology extracts various geometric data points that make a face unique, like the distance between the eyes, nose and ears to create a face signature or map.

 Then the technology compares that to other faces in its database of face templates to see if there is a match, at which point Facebook may suggest “tagging’’ the person in the photo.
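The matching process the court describes can be illustrated with a toy sketch: reduce a face to the distances between a few landmark points, then compare that “signature’’ against stored templates. The landmark names, coordinates and threshold below are purely illustrative assumptions, not Facebook’s actual system.

```python
import math

def signature(landmarks):
    """Build a 'face signature': pairwise distances between landmark points."""
    names = sorted(landmarks)
    return [
        math.dist(landmarks[a], landmarks[b])
        for i, a in enumerate(names)
        for b in names[i + 1:]
    ]

def best_match(sig, templates, threshold=5.0):
    """Return the stored identity whose template is closest, if close enough."""
    dist, name = min(
        (math.dist(sig, tmpl), name) for name, tmpl in templates.items()
    )
    return name if dist <= threshold else None

# Hypothetical landmark coordinates (in pixels) for a stored user and a new upload.
stored = {"left_eye": (30, 40), "right_eye": (70, 40), "nose": (50, 65), "mouth": (50, 85)}
upload = {"left_eye": (31, 41), "right_eye": (69, 40), "nose": (50, 66), "mouth": (51, 84)}

templates = {"alice": signature(stored)}
print(best_match(signature(upload), templates))  # near-identical geometry matches "alice"
```

Because the signature is built from ratios of facial geometry rather than raw pixels, the same person can be recognized across different photos — which is exactly why, as the opinion notes, a compromised template cannot simply be replaced.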

Ikuta said that process creates privacy concerns.

“Once a face template of an individual is created, Facebook can use it to identify that individual in any of the other hundreds of millions of photos uploaded to Facebook each day, as well as determine when the individual was present at a specific location,’’ she wrote. “Facebook can also identify the individual’s Facebook friends or acquaintances who were present in the photo.’’

And it’s not just what can happen now, she said, given how technology is developing.

“It seems likely that a face-mapped individual could be identified from a surveillance photo taken on the streets or in an office building,’’ Ikuta said.

“Or a biometric face template could be used to unlock the face recognition lock on the individual’s cellphone,’’ she continued, adding:

“We conclude that the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.’’

And that kind of conduct, Ikuta said, is grounds for litigation.

A spokesman for Facebook told Capitol Media Services the company plans to appeal the 9th Circuit decision.

“We have always disclosed our use of face recognition technology and that people can turn it on or off at any time,’’ the spokesman said.

Brnovich said, though, that an issue in these kinds of cases is how easy it is to opt out.

In fact, he wrote to Facebook last year complaining that it took 21 different clicks and screens for someone to be able to opt out of the company’s data collection policies. The company subsequently agreed to make some changes.

That, however, still leaves the question of what rights Arizonans already have to sue over their images being collected, digitized and stored.

“I have always believed that because we have that right to privacy, that provides us more protection than the Fourth Amendment does,’’ Brnovich said, with the latter covering “unreasonable searches and seizures’’ and requiring government agents to first obtain a warrant.

Still, he conceded, it remains unsettled exactly how broad that right to privacy is — especially when it is being invaded not by a government agency but by private corporations.

“One of the things that we have recently seen is government working with Big Tech and internet service providers to get information that affects individual rights,’’ Brnovich said. “So, we’re starting to see that line blur a little bit more and more when government is using Big Tech and internet service providers to pretty much do its bidding.’’

If nothing else, Brnovich said there needs to be a clear state law about how private companies can use information, particularly if they are making money selling it to others.

“Well, if that’s the case, I should have some sort of property right,’’ he said.

“So, if companies want to buy it, collect it, trade it, sell it, whatever they’re going to do with it, then I should know about it and maybe, ultimately, be provided some sort of compensation for it,’’ Brnovich said.
