Government watchdog warns unemployment facial recognition system has 'racial and gender' bias
The federal watchdog cited a report showing that black applicants to the Pandemic Unemployment Assistance program in two states received payments at about half the rate of white applicants.
The Labor Department's inspector general is warning that facial recognition technology used by state unemployment agencies to verify identity has a "racial and gender bias" and could discriminate against claimants.
In a memo last week, the federal watchdog cited "urgent equity and security concerns" about the technology because unemployment "programs have become a target for fraud with significant numbers of imposter claims being filed with stolen or synthetic identities" since the start of the COVID-19 pandemic.
From March 2020 through September 2020, four states paid $9.9 billion in likely fraudulent claims. In response, 24 of the 53 state workforce agencies hired 10 identity-verification service contractors that used facial recognition.
The inspector general said it is "concerned that the use of identity verification service contractors may not result in equitable and secure access."
In one federal report cited by the watchdog, black applicants to the Pandemic Unemployment Assistance program in two states received payments at about half the rate of white applicants.
The inspector general issued several recommendations to the acting Assistant Secretary for Employment and Training, including directing state agencies to provide alternatives to facial recognition and requiring facial recognition services to test for bias.
The assistant secretary agreed with the recommendations.