IBM has announced that it will no longer offer general-purpose facial recognition or analysis software. The decision was published in a letter penned by CEO Arvind Krishna on June 8, and is part of a pledge to work with Congress “in pursuit of justice and racial equity.”
According to Krishna, IBM initially plans to focus its efforts on three policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities. The letter, addressed to Congress, outlines policy proposals aimed at promoting the responsible use of technology in law enforcement.
“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency,” noted the letter.
Krishna further articulated IBM’s stance on artificial intelligence, noting that vendors and users of AI systems “have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”
Facial recognition software is an ongoing subject of scrutiny and has been largely unregulated. Krishna’s letter called for national policy to “encourage and advance uses of technology that bring greater transparency and accountability to policing, such as body cameras and modern data analytics techniques.”
In a 2018 study, MIT researchers showed that machine learning algorithms can discriminate based on classes such as race and gender. The paper notes that many AI systems, including face recognition tools, rely on machine learning algorithms trained on biased data, which has resulted in algorithmic discrimination.
In a blog post published Dec. 6, 2018, Microsoft noted that despite its many benefits, certain uses of facial recognition technology increase the risk of biased decisions, outcomes, and experiences, with higher error rates when determining the gender of women and people of color.
To combat bias, intrusion on privacy, and the potential encroachment on democratic freedoms, Microsoft called for these problems to be addressed through legislation. The company has also taken steps, through research and policy updates, to help engineers identify blind spots.