Schools Are Using AI to Track What Students Write on Their Computers


This software is raising ethical concerns about the line between safety and privacy. 

Over 50 million K-12 students will go back to school in the U.S. this month. For many of them using a school computer, every word they type will be tracked.

Under the Children’s Internet Protection Act, any school that receives federal funding is required to have an internet-safety policy. As school-issued tablets and Chromebook laptops become more commonplace, schools must install technological guardrails to keep their students safe. For some, this simply means blocking inappropriate websites. Others, however, have turned to software companies like Gaggle, Securly, and GoGuardian to surface potentially worrisome communications to school administrators.

These safety management platforms (SMPs) use natural-language processing to scan through the millions of words typed on school computers. If a word or phrase might indicate bullying or self-harm behavior, it gets surfaced for a team of humans to review.
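In its simplest form, that flagging step can be thought of as matching text against lists of risk phrases and queuing hits for a human reviewer. The sketch below is a hypothetical illustration of that idea, not any vendor's actual system; real platforms use trained language models rather than hand-written phrase lists, and every phrase here is invented for the example.

```python
# Hypothetical sketch of SMP-style keyword flagging: scan student text for
# risk phrases and queue "incidents" for human review. Real products use
# trained NLP models; this is only an illustration of the workflow.
import re

# Invented example phrase lists, grouped by the kind of harm they might signal.
RISK_PHRASES = {
    "self-harm": ["hurt myself", "want to die"],
    "bullying": ["everyone hates you", "kill yourself"],
}

def flag_for_review(document_text: str) -> list[dict]:
    """Return incident records for a human review team to triage."""
    incidents = []
    lowered = document_text.lower()
    for category, phrases in RISK_PHRASES.items():
        for phrase in phrases:
            # Word boundaries avoid matching inside unrelated words.
            if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
                incidents.append({"category": category, "phrase": phrase})
    return incidents

print(flag_for_review("I can't take it anymore, I want to hurt myself."))
# → [{'category': 'self-harm', 'phrase': 'hurt myself'}]
```

The human-review step matters because naive matching like this produces false positives (a book report quoting a novel, say), which is exactly why these companies employ reviewer teams.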

In an age of mass school shootings and increased student suicides, SMPs can play a vital role in preventing harm before it happens. Each of these companies has case studies where an intercepted message helped save lives. But the software also raises ethical concerns about the line between protecting students’ safety and protecting their privacy.

“A good-faith effort to monitor students keeps raising the bar until you have a sort of surveillance state in the classroom,” Girard Kelly, the director of privacy review at Common Sense Media, a non-profit that promotes internet-safety education for children, told Quartz. “Not only are there metal detectors and cameras in the schools, but now their learning objectives and emails are being tracked too.”

The debate around SMPs sits at the intersection of two topics of national interest—protecting schools and protecting data. As more and more schools go one-to-one, the industry term for assigning every student a device of their own, the need to protect students’ digital lives is only going to increase. Over 50% of teachers say their schools are one-to-one, according to a 2017 survey from Freckle Education, meaning there’s a huge market to tap into.

But even in an age of student suicides and school shootings, when do security precautions start to infringe on students’ freedoms?

Safety, managed

The most popular SMPs all work slightly differently. Gaggle, which charges roughly $5 per student annually, is a filter on top of popular tools like Google Docs and Gmail. When the Gaggle algorithm surfaces a word or phrase that may be of concern—like a mention of drugs or signs of cyberbullying—the “incident” gets sent to human reviewers before being passed on to the school. Securly goes one step beyond classroom tools and gives schools the option to perform sentiment analysis on students’ public social media posts. Using AI, the software is able to process thousands of student tweets, posts, and status updates to look for signs of harm. 
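The sentiment-analysis step Securly describes can be approximated, in its crudest form, as scoring each post against lists of positive and negative words and flagging posts below a threshold. The sketch below is a hypothetical, lexicon-based stand-in for that idea; production systems use trained models, and the word lists and threshold here are invented for illustration.

```python
# Hypothetical lexicon-based sentiment scoring over public posts, a crude
# stand-in for the trained models SMP vendors actually use. All word lists
# and the threshold are invented for this example.
NEGATIVE = {"hate", "hopeless", "worthless", "alone"}
POSITIVE = {"happy", "excited", "great", "love"}

def sentiment_score(post: str) -> int:
    """Positive words add 1, negative words subtract 1."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def posts_needing_review(posts: list[str], threshold: int = -1) -> list[str]:
    """Flag posts whose score falls at or below the threshold."""
    return [p for p in posts if sentiment_score(p) <= threshold]

posts = [
    "So excited for the game tonight",
    "I feel hopeless and alone",
]
print(posts_needing_review(posts))
# → ['I feel hopeless and alone']
```

Even at this toy scale, the design trade-off the article describes is visible: a lower threshold misses warning signs, a higher one sweeps in ordinary teenage venting for adult review.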

Kelly thinks SMPs help normalize surveillance from a young age. In the wake of the Cambridge Analytica scandal at Facebook and other recent data breaches from companies like Equifax, we have the opportunity to teach kids the importance of protecting their online data, he said.

“There should be a whole gradation of how this [software] should work,” Daphne Keller, the director of the Stanford Center for Internet and Society (and mother of two), told Quartz. “We should be able to choose something in between, that is a good balance [between safety and surveillance], rather than forcing kids to divulge all their data without any control.”

To be sure, in an age of increased school violence, bullying, and depression, schools have an obligation to protect their students. But the protection of kids’ personal information is also a matter of their safety. Securly CEO Vinay Mahadik agrees that privacy is an important concern, but believes companies like his can strike the right balance of freedom and supervision.

“Not everybody is happy because we are talking about monitoring kids,” Mahadik told Quartz. “But as a whole, everyone agrees there has to be a solution for keeping them safe. That’s the fine line we’re walking.”

Informed consent

Critics like Keller believe digital surveillance might have a chilling effect on students’ freedom of expression. If students know they’re being monitored, they might censor themselves from speaking their mind. This would, of course, only occur if the students knew they were being watched.

Though most school districts require parents to sign blanket consent agreements to use technology in the classroom, some districts believe they’ll get a more representative picture of behavior if students aren’t aware of the software, according to Patterson. In other words, some districts don’t let the kids know they’re being tracked. 

“Parental consent can be a get-out-of-jail-free card for vendors,” Bill Fitzgerald, a long-time school technology director who now advises schools and non-profits on privacy issues, told Quartz. “When a parent consents to terms [to a variety of edtech tools] at the beginning of the school year, that’s all the third-party really needs to operate.”

SMPs market to parents’ and school districts’ biggest fears. “This might be the only insight adults get to a student’s suffering,” Securly’s website says, before quoting a director of IT in Michigan public schools: “Just one avoidance of a young person harming themselves or others would be worth a thousand times the subscription price.” 

Gaggle has gone even further, suggesting that the same software schools use to monitor students can also be used to surveil teachers. “Think about the recent teacher work stoppage in West Virginia,” a recent blog post reads. “Could the story have been different if school leaders there requested search results for ‘health insurance’ or ‘strike’ months earlier? Occasional searches for ‘salary’ or ‘layoffs’ could stave off staff concerns that lead to adverse press for your school district.”

(The company has since taken the post down. In an email, Patterson told Quartz that it was not in line with Gaggle’s mission “to ensure the safety and well-being of students and schools.”)

Avoiding bad press and preventing teacher strikes have little to do with keeping students safe, but the implied message from the post is clear: Gaggle’s clients are administrators, not the students or teachers. 

The concern, however, is that students’ protection is coming at the expense of their privacy. As kids spend more of their formative years online, they also need safe digital spaces to explore their own identities.

“Suppose you are a kid considering suicide and you want to write a diary about it or talk to your friend about the feelings that you’re having, but you don’t because you’re afraid you’ll be turned in to your parent,” Keller said. “I’m not sure that’s a good outcome.”

When we start monitoring kids’ behavior from a young age, Keller believes it can set a dangerous precedent. As adults reckon with issues of privacy and data protection, she believes kids must also learn what it means to give companies access to their personal information.

“I’m worried about how clearly my kid knows what he’s agreed to when receiving that district-provided device,” Liz Kline, a California parent, told Quartz. “It’s fine now when he’s six, but what about when he’s in high school and wants to organize a walkout?”