
(CNN Money) — US Rep. Jimmy Gomez was not terribly shocked to learn he was among 28 members of Congress matched to criminal mugshots in an experiment by the ACLU of Northern California. In fact, the California Democrat had a pretty good idea what many of the people on that list had in common.

“I told my staff that I wouldn’t be surprised if it was mostly people of color or minorities,” he told CNNMoney.

He was right.

On Thursday, the ACLU released the results of its test of Amazon’s controversial Rekognition software, which uses artificial intelligence to, among other things, recognize people.

After building a database of 25,000 publicly available arrest photos and comparing it to all 535 members of Congress, the ACLU found that Rekognition, when left to its default settings, identified lawmakers like Luis Gutierrez of Illinois, John Lewis of Georgia, and Norma Torres of California as criminals.

Of the 28 lawmakers Rekognition misidentified, about 40% were people of color, a result that led the ACLU to call for a government moratorium on the use of facial recognition software — a call that gained traction Thursday among lawmakers.

“This is a big deal,” Gomez said. “The bias that exists will be digitized and used against people who already have a lot of obstacles and struggles.”

Although facial recognition’s lower accuracy for women and people of color is a known issue, experts say the ACLU’s findings underscore the need for broader conversations about the ethics of such technology and the responsibility of companies creating it to ensure it is used fairly.

To that end, Gomez and Lewis sent Amazon CEO Jeff Bezos a letter requesting “an immediate meeting to discuss how to address the defects of this technology in order to prevent inaccurate outcomes.”

In a separate letter, Sen. Edward Markey and Reps. Mark DeSaulnier and Luis Gutierrez asked Bezos to provide details on any internal accuracy or bias assessments Amazon has conducted on Rekognition. They also want to know which law enforcement or intelligence agencies use Rekognition, and whether Amazon audits their use of it.

Amazon pushed back against the ACLU’s results by arguing, in effect, that the organization didn’t use the tool properly.

“We think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test,” an Amazon spokesperson said in a statement. “While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95% or higher.”
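The threshold Amazon describes is a cutoff applied to the similarity score the software assigns to each candidate match. As a rough illustration of the dispute — a minimal sketch, with invented names and scores, not data from the ACLU test or Amazon’s actual API responses — the same set of candidate matches can pass at the 80% default yet fail at the stricter 95% level Amazon recommends for law enforcement:

```python
def filter_matches(matches, threshold):
    """Keep only candidate matches at or above the similarity threshold."""
    return [m for m in matches if m["Similarity"] >= threshold]

# Invented example data, loosely shaped like the FaceMatches list that
# Rekognition's face-comparison APIs return (each match carries a
# Similarity percentage). The names and scores are hypothetical.
matches = [
    {"Name": "candidate_a", "Similarity": 81.2},
    {"Name": "candidate_b", "Similarity": 96.7},
]

# At the 80% default the ACLU says it used, both candidates count
# as matches; at Amazon's recommended 95%, only the stronger one does.
print(len(filter_matches(matches, 80)))  # 2
print(len(filter_matches(matches, 95)))  # 1
```

The point of contention is precisely this parameter: the ACLU ran the test at the software’s default, while Amazon argues sensitive uses should raise it.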

Suresh Venkatasubramanian, a computer scientist at the University of Utah, said that argument lets Amazon off the hook too easily. The company has an obligation to make sure people know how to use its tools properly, he told CNNMoney.

“You had the choice to not put the tech out if you felt it was sensitive,” he said. “We don’t let people randomly prescribe medicines to themselves.”

Tech companies are coming around to this view. In a blog post published earlier this month, Microsoft President Brad Smith suggested regulating facial recognition technology given its “broad societal ramifications and potential for abuse.”

Bezos has not addressed the issue publicly despite recent calls from shareholders and civil rights groups to stop selling the technology to the government. In the statement to CNNMoney, Amazon called Rekognition “a driver for good in the world.”

Perhaps, or perhaps not, said Woodrow Hartzog, who teaches law and computer science at Northeastern University. “The idea that this is simply neutral technology that can be used for good or evil and Amazon shouldn’t be responsible, I think is purely wrong,” he said.

“It’s not unreasonable to say if you build a product that is capable of harm then you should be responsible for the design choices you make for enabling the harm,” he said, “and when you release it out into the world, you’re doing so in a safe and sustainable way.”

(©2018 Cable News Network, Inc., a Time Warner Company. All rights reserved. By Sara Ashley O’Brien)
