The company said it hoped the moratorium “might give Congress enough time to put in place appropriate rules” for the technology.
SEATTLE — Amazon said on Wednesday that it was putting a one-year pause on letting the police use its facial recognition tool, in a major sign of the growing concerns that the technology may lead to unfair treatment of African-Americans.
The technology giant did not explain its reasoning in its brief blog post about the change, but the move came amid the nationwide protests over racism and biased policing. Amazon’s technology had been criticized in the past for misidentifying people of color.
The announcement was a striking change for Amazon, a prominent supplier of facial recognition software to law enforcement. More than other big technology companies, Amazon has resisted calls to slow its deployment. In the past, Amazon had said its tools were accurate but were improperly used by researchers.
On Monday, IBM said it would stop selling facial recognition products, and last year, the leading maker of police body cameras banned the use of facial recognition on its products at the recommendation of its independent ethics board, which said the technology “is not currently reliable enough to ethically justify its use.” Google has advocated a temporary ban on the technology.
The American Civil Liberties Union applauded Amazon in a statement for “finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly.” But it said that the company should extend the moratorium on law enforcement use of its system until Congress passed a law regulating the technology.
“Face recognition technology gives governments the unprecedented power to spy on us wherever we go,” Nicole Ozer, technology and civil liberties director for the A.C.L.U. of Northern California, said in the statement. “It fuels police abuse. This surveillance technology must be stopped.”
Law enforcement agencies use facial recognition technology to identify suspects and missing children. The systems work by trying to match facial pattern data extracted from photos or video against databases like driver’s license records. The authorities used the technology to help identify the suspect in the 2018 mass shooting at a newspaper in Annapolis, Md.
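At a high level, the matching step described above can be sketched as a nearest-neighbor search over numeric face “embeddings.” The following is a minimal, hypothetical illustration, not Amazon’s actual method (Rekognition’s internals are not public); the function names, the toy data, and the similarity threshold are all assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.8):
    """Return (record_id, score) for the closest database embedding,
    or None if nothing clears the threshold (no confident match)."""
    best_id, best_score = None, -1.0
    for record_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = record_id, score
    return (best_id, best_score) if best_score >= threshold else None

# Toy 3-dimensional "embeddings"; real systems use vectors with hundreds
# of dimensions produced by a neural network from the face image.
database = {
    "license_001": [0.90, 0.10, 0.10],
    "license_002": [0.10, 0.90, 0.20],
}
probe = [0.85, 0.15, 0.10]
print(best_match(probe, database))  # license_001 scores highest
```

In a system of this kind, false matches of the sort critics describe occur when two different people’s embeddings happen to land close together; lowering the threshold to catch more suspects also raises that risk.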
But civil liberties groups have warned that the technology can be used at a distance to secretly identify individuals — such as protesters attending demonstrations — potentially chilling Americans’ right to free speech or simply limiting their ability to go about their business anonymously in public. Some cities, including San Francisco and Cambridge, Mass., have passed bans on the technology.
This week, Democrats in the House introduced a police reform bill that would ban the use of facial recognition technology with police recording equipment. Some lawmakers have long worried about the technology, questioning manufacturers and the public agencies that use their products about its effects on civil rights and privacy.
Civil liberties advocates began a campaign to ban the use of facial recognition by law enforcement in 2018, after a report by academic researchers found racial bias in the systems. The report found that facial analysis systems made by IBM and Microsoft correctly identified the gender of white men in photographs nearly 100 percent of the time, but were far less accurate at identifying the gender of darker-skinned women.
IBM and Microsoft quickly improved their systems, while Amazon found itself under heightened scrutiny.
For the past two years, the A.C.L.U. has led a campaign to push Amazon to stop selling the technology to law enforcement agencies. Using open records laws, the group obtained documents from police departments showing how aggressively Amazon was marketing its technology to law enforcement.
The A.C.L.U. also tested Amazon’s technology by comparing head shots of members of Congress against a database of publicly available mug shots. The group reported that the Amazon technology incorrectly matched 28 members of Congress with people who had been arrested, amounting to a roughly 5 percent error rate among legislators. At the time, Amazon disputed the findings, saying that the group had used its system differently from the way its law enforcement customers did.
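The 5 percent figure can be checked directly: Congress has 535 voting members (435 representatives and 100 senators), and the A.C.L.U. reported 28 false matches among them.

```python
# 28 of the 535 members of Congress were falsely matched to mug shots.
members = 535
false_matches = 28
error_rate = false_matches / members
print(f"{error_rate:.1%}")  # → 5.2%, i.e. roughly 5 percent
```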
Rep. Jimmy Gomez, a California Democrat and one of the lawmakers misidentified in the A.C.L.U. test, said he met with Amazon about the issue almost a dozen times. He said Amazon was less open to criticism than its tech peers.
“They were avoiding taking any responsibility for their technology in my opinion,” Mr. Gomez said on Wednesday after the company’s announcement. “They always had some excuse.”
Mr. Gomez, who is vice chairman of the House Committee on Oversight and Reform, said he was glad to see Amazon halt police sales.
“Amazon can sense that the American people don’t want platitudes when it comes to dealing with disparities right now,” he said. “They want concrete action.”
Amazon introduced Rekognition in 2016 as a low-cost, “highly scalable” way to identify images, including people, in vast databases. Soon after, it began pitching the police on the tool to help investigations, and law enforcement agencies began adopting the technology.
In an interview on the PBS show “Frontline” earlier this year, Andy Jassy, the chief executive of Amazon Web Services, said he did not think the company knew how many police departments were deploying the technology.
Last fall, Jeff Bezos, Amazon’s chief executive, said the company was drafting privacy legislation for facial recognition. But he indicated that Amazon would continue selling the tools in the meantime.
“It’s a perfect example of something that has really positive uses, so you don’t want to put the brakes on it,” Mr. Bezos said. “At the same time, there is lots of potential for abuses with that kind of technology, so you want regulations.”
He said he would welcome “good regulations” on the issue. “That kind of stability I think would be healthy for the whole industry,” he said.
Mr. Bezos did not provide details for what the company’s proposed legislation would entail.
Mr. Gomez said he had not seen any model legislation proposed by Amazon, adding, “That would have been news to me.”
Karen Weise reported from Seattle, and Natasha Singer from New York. David McCabe contributed reporting from Washington.