A State Becomes the First to Suspend Facial Recognition Technology in Schools


New York Gov. Andrew Cuomo said there are “serious and legitimate privacy concerns” with the technology—but only prohibited its use until 2022.

New York suspended the use of facial recognition software in schools this week, becoming the first state to officially pause deployment of the technology in educational settings while privacy concerns are examined.

In signing the legislation, Gov. Andrew Cuomo said that while the technology “could provide a host of benefits to New Yorkers … its use brings up serious and legitimate privacy concerns that we have to examine." 

The new law doesn’t create a permanent ban on the use of facial recognition, but prohibits its use in public and private schools until at least July 1, 2022. After that point, the state education commissioner could approve its use following a study of biometric identifying technology. The state Office of Information Technology and the state Education Department are directed to collect feedback from teachers, parents, and experts in data privacy and school safety.

"The safety and security of our children is vital to every parent, and whether to use this technology is not a decision to be made lightly,” Cuomo said. 

More than 100 civil rights, racial justice, and data privacy organizations expressed their support for the legislation.

In the past few years, cities including San Francisco, Oakland, Boston, and Portland have banned government use of facial recognition technology, though that legislation is often focused on law enforcement. Massachusetts this month came close to becoming the first state to restrict law enforcement use of facial recognition, but Gov. Charlie Baker sent the bill back to the legislature this week, asking lawmakers to loosen the restrictions.

The law in New York was a response to one school district’s decision to use facial recognition technology in its K-12 schools.

In June, the ACLU of New York sued the state education department over the installation of a facial recognition system in Lockport City Schools, a district near Niagara Falls. The state’s approval of this technology showed a “dangerous lack of oversight and an alarming misunderstanding of the way it analyzes student data,” the civil liberties group said. The implementation of the system violated students’ rights to privacy, the lawsuit argued, because the district wanted to sync their data to local, state, and federal crime databases.

School officials defended the system, saying it would help identify school shooters as they entered the building, giving staff a chance to prevent an attack. In that kind of situation, “that time—minutes, seconds—can save lives,” the district’s technology director said over the summer. 

The Lockport system would have used real-time collection and analysis of biometric information from students, including those as young as five. The system also recognized “flagged faces” and guns, which the district said would protect students from intruders, sex offenders, and shootings. But Black parents who were plaintiffs in the ACLU’s suit said the district ignored evidence that facial recognition software is racially biased against people of color.

A growing body of research has shown that this software has much higher error rates when identifying and distinguishing Black faces. Young Black women, in particular, are misidentified at rates far higher than those for other racial, age, and gender groups. Several research institutions have recommended a complete ban on the use of the technology in schools.

The debate over the use of facial recognition in schools often picks up after school shootings, when proponents of such systems, like those in Lockport, argue that they could be used to save lives. A few months after the school shootings in Parkland, Florida and Santa Fe, Texas, one facial recognition software developer began offering its technology to schools for free. 

Some policy experts argue that facial recognition won’t actually stop school shootings, because most shooters are current students with no prior records who therefore wouldn’t be flagged by such systems.

Still, Lockport administrators said they were “profoundly disappointed” by the bill’s passage in July.

State Sen. Brian Kavanagh, one of the bill’s sponsors alongside Assemblymember Monica Wallace, said “it makes no sense to bring this aggressive surveillance technology into our schools when no one has made a compelling case, either that it will meaningfully improve security or that it can be used without violating the privacy and civil rights of students, staff, and visitors.”

“This law will ensure that state education officials review this technology and vouch for it before any young people are subjected to it,” he continued. “I expect that they will conclude that it is neither necessary nor appropriate in schools.”