IBM stops facial recognition and joins the call for police reforms

IBM is getting out of the facial recognition business, saying it is concerned about how the technology can be used for mass surveillance and racial profiling.

Ongoing protests in response to George Floyd’s death have led to a broader reckoning over racial injustice and a closer look at the use of police technology to track down protesters and monitor American neighborhoods.

IBM is one of several big tech companies that had previously sought to improve the accuracy of their face-scanning software after research revealed racial and gender disparities. But its new CEO is now questioning whether the technology should be used by police at all.

“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” CEO Arvind Krishna wrote in a letter sent to U.S. lawmakers on Monday.

IBM’s decision to stop building and selling facial recognition software is unlikely to affect its bottom line: the tech giant is increasingly focused on cloud computing, while an array of lesser-known companies has cornered the market for government facial recognition contracts.

“But the symbolic nature of this is important,” said Mutale Nkonde, a research fellow at Harvard and Stanford universities who leads the nonprofit AI For the People.

Nkonde said IBM shutting down a business “under the guise of advancing anti-racist business practices” shows that it can be done and makes it “socially unacceptable for companies who tweet Black Lives Matter to do so while contracting with the police.”

Krishna’s letter was addressed to a group of Democrats who have been working on police reform legislation in Congress, spurred by the mass protests over Floyd’s death. The sweeping reform package could include restrictions on police use of facial recognition.

The practice of using a form of artificial intelligence to identify individuals in photo databases or video feeds came under scrutiny after researchers found racial and gender disparities in systems built by companies such as IBM, Microsoft and Amazon.

IBM had previously tested its facial recognition software with the New York Police Department, though the department says it has more recently relied on other vendors. It is not clear whether IBM has existing contracts with other government agencies.

Many U.S. law enforcement agencies rely on facial recognition software built by companies less known to the public, such as the Tokyo-based NEC or European companies Idemia and Cognitec, according to Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology.

A smaller number work with Amazon, which has faced the most opposition from privacy advocates since it introduced its Rekognition software in 2016.

Krishna’s letter called for police reforms and stated that “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling” and human rights violations.

Civil liberties advocates have expressed concern in recent weeks about using surveillance technology to monitor demonstrators or enforce rules put in place to curb the coronavirus pandemic.

Even before the protests, U.S. senators this year were scrutinizing the New York facial recognition startup Clearview AI, following investigative reports on its practice of harvesting billions of photos from social media and other internet services to identify people.

Joy Buolamwini, a researcher at the Massachusetts Institute of Technology whose work on facial recognition bias helped prompt IBM’s re-examination of the technology, said Tuesday that she commends the congressional police reform package for seeking restrictions on the use of police body cameras to scan people’s faces in real time.

But she said lawmakers could go further to protect people from having governments scan their faces without their knowledge, whether from social media posts or in public spaces.

“Regardless of the accuracy of these systems, facial recognition mass surveillance can have chilling effects and silence dissent,” Buolamwini wrote in testimony supporting a proposed ban on municipal government use of facial recognition, submitted at Boston City Hall. San Francisco and several other U.S. cities have enacted similar bans in the past year.

Copyright 2020 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or distributed without permission.