Algorithmic Justice League

Digital advocacy non-profit organization


February 2022 The IRS agreed to halt its facial recognition program after AJL and other organizations sent letters to legislators requesting intervention, marking a significant victory for the organization's advocacy against invasive facial recognition technologies.
January 2022 AJL collaborated with Fight for the Future and the Electronic Privacy Information Center to launch the DumpID.me online petition, calling for the IRS to stop using ID.me facial recognition technology for user logins.
2021 Fast Company recognized the Algorithmic Justice League as one of the 10 most innovative AI companies in the world, highlighting the organization's significant impact in addressing AI bias and promoting ethical AI development.
September 2021 Algorithmic Justice League collaborated with Olay and O'Neil Risk Consulting & Algorithmic Auditing (ORCAA) to conduct the Decode the Bias campaign, auditing the Olay Skin Advisor (OSA) System for bias against women of color.
2020 Featured in the Netflix documentary 'Coded Bias', which premiered at the Sundance Film Festival, highlighting the Algorithmic Justice League's research and advocacy against algorithmic bias in facial recognition systems.
2020 Advocacy efforts by AJL and similar groups led Amazon and IBM to address algorithmic biases and temporarily ban their facial recognition products from police use.
July 2020 The Algorithmic Justice League officially launched the Community Reporting of Algorithmic System Harms (CRASH) Project, which aims to create a framework for bug-bounty programs to uncover and report algorithmic bias in AI technologies.
May 2020 Released a white paper calling for the creation of a new United States federal government office to regulate facial recognition technologies, aiming to reduce risks of mass surveillance and bias towards vulnerable populations.
March 2020 Algorithmic Justice League released 'Voicing Erasure', a spoken word artistic piece highlighting racial bias in automatic speech recognition (ASR) systems. The piece was performed by prominent researchers including Ruha Benjamin, Sasha Costanza-Chock, Safiya Noble, and Kimberlé Crenshaw, based on a PNAS research paper documenting racial disparities in commercial ASR system performance.
2019 Joy Buolamwini and digital security researcher Camille François first met at the Bellagio Center Residency Program hosted by The Rockefeller Foundation, which initiated the conceptualization of the CRASH project.
2019 Joy Buolamwini represented the Algorithmic Justice League (AJL) at a US House Committee on Science, Space, and Technology congressional hearing, discussing facial recognition technologies. She testified about the underperformance of these technologies in identifying people with darker skin and feminine features, presenting research from the AJL's 'Gender Shades' project.
2018 Joy Buolamwini and Timnit Gebru published the 'Gender Shades' study, revealing significant racial and gender bias in facial recognition algorithms from Microsoft, IBM, and Face++. The study demonstrated that machine learning models were less accurate when analyzing dark-skinned and feminine faces.
2018 Launched the Safe Face Pledge in collaboration with the Georgetown Center on Privacy & Technology, urging technology organizations and governments to prohibit lethal use of facial recognition technologies.
2016 Joy Buolamwini founded the Algorithmic Justice League (AJL), a digital advocacy non-profit organization based in Cambridge, Massachusetts, dedicated to raising awareness about bias and potential harms in artificial intelligence systems.

The contents of the box above are based on material from the Wikipedia article Algorithmic Justice League, which is released under the Creative Commons Attribution-ShareAlike 4.0 International License.
