Quick Hits
*** A study released last week by the National Institute of Standards and Technology demonstrated that most facial recognition software systems do a poor job of making correct matches between images of individuals of African and Asian descent. The disparity in false positives between different groups is characterized as "large." In addition to racial effects, most algorithms generate more false positive matches among women than among men.
In a Dec. 20 letter to Acting Homeland Security Secretary Chad Wolf, Rep. Bennie Thompson (D-Miss.) called the results "shocking." Thompson, who chairs the House Homeland Security Committee, said the results "call into question not only DHS's future plans for expanding the use of facial recognition technology, but also the Department's current operations."
*** A group of House Democrats led by Rep. Mark Takano (D-Calif.), the chairman of the Veterans Affairs Committee, is calling on the VA to rethink its approach to implementing a group of workforce executive orders outside the collective bargaining process.
"Failure to work effectively with organized labor and protect the rights of workers is a failure to adequately fulfill the mission of VA," the lawmakers stated in a Dec. 19 letter.




