Is Big Brother Watching Our Campuses?

At Georgia State University, algorithms alert advisers when a student falls behind in class. Course-planning tools tell students the classes and majors they're likely to complete, based on the performance of other students like them. When students swipe their ID cards to attend a tutoring or financial-literacy session, the university can send attendance data to advisers and staff.

Colleges are analyzing all kinds of student data to figure out who needs extra support and when advisers and faculty should intervene. But as technology advances, and students' offline and online lives become more intertwined, data analytics—particularly predictive analytics—may raise more ethical questions.

Georgia State started using predictive analytics in 2012. It worked with the Education Advisory Board, a consulting company, to analyze millions of past course grades and create algorithms that identify signs of academic struggle—from the obvious, such as failing a class, to the not-so-obvious, such as barely passing a core class required for a major. The system can predict students' likelihood of success in any major.
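The article does not publish the Education Advisory Board's actual model, but the approach it describes—flagging warning signs in course grades and estimating outcomes from similar past students—can be sketched roughly. Everything below is illustrative: the function names, thresholds, and grade scale are assumptions, not Georgia State's system.

```python
# Illustrative sketch only, not the Education Advisory Board's model.
# Grades are on a 4.0 scale; thresholds are assumed for demonstration.

def flag_warning_signs(grades, core_courses, barely_passing=2.0, failing=1.0):
    """Flag the signs described in the article: failing any class (obvious),
    or barely passing a core class required for the major (not-so-obvious)."""
    flags = []
    for course, gpa in grades.items():
        if gpa < failing:
            flags.append((course, "failing"))
        elif course in core_courses and gpa < barely_passing:
            flags.append((course, "barely passing core course"))
    return flags

def predicted_success(student_gpa, past_students):
    """Estimate a student's likelihood of success as the graduation rate of
    past students with a similar GPA (within +/- 0.25). past_students is a
    list of (gpa, graduated) pairs mined from historical records."""
    similar = [grad for gpa, grad in past_students
               if abs(gpa - student_gpa) <= 0.25]
    if not similar:
        return None  # no comparable students on record
    return sum(similar) / len(similar)
```

A real system would weigh many more signals (financial-aid data, card swipes) and use statistical models rather than a fixed GPA window, but the basic shape—mine past records, profile the individual, act on the profile—is what Reidenberg describes below.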

The university has since added financial-aid data (and will add card swipe data) to the predictive model to create a more comprehensive assessment of every student's progress. All the data work has helped the university target outreach to more than 32,000 students, many of whom are low-income, African-American, or Hispanic.

Such data analysis is legal and performed with students' interests in mind. But it still raises privacy and ethics concerns, says Joel Reidenberg, founding academic director of the Center on Law and Information Policy at Fordham University.

Even when colleges are collecting aggregate data and scrubbing it of personally identifiable information, if they use it to guide individuals, that's surveillance, he says.

"You have to do the data-mining to be able to profile the individual. And you're taking action based on the data-mined profile," he says.

Should students be told that universities are mining their data?

"We have thought about that," says Timothy Renick, vice president for enrollment management and student success at Georgia State. He says that if a student were to complain, the university would stop tracking them. So far, the university has received nothing but positive feedback from students.

It's now routine for websites of all kinds to customize experiences based on data analytics. Young people have different expectations of privacy; after all, they grew up sharing personal information on social media.

"What's happening now with the university's interaction with them is not that different from what's been happening on Facebook, and other places—Amazon, and so forth," Renick says.

The average college has nothing close to the analytic capacity of Facebook or Amazon. But, in theory, colleges could data-mine almost every aspect of a student's life. Institutions can track what students say in online course discussion forums, which students download the lecture notes, and how long they spend reviewing online material. Institutions can track ID card swipes to record students' physical movements, from the dining hall to the health center.

Institutions can build or purchase software, like the Knewton platform, that analyzes students' every keystroke to figure out their learning style.

"The NSA has nothing on the ed tech start-up known as Knewton," Politico wrote earlier this year. Some of the data that learning applications collect doesn't fall under the federal government's definition of an "educational record," and thus doesn't fall under the laws that restrict what data colleges can and cannot share with third parties.

Some experts are starting to question how universal data collection could affect the educational experience. Matt Pittinsky is a data guy: He cofounded Blackboard, a learning management system, serves as the chief executive officer of Parchment, an education technology company, and teaches sociology at Arizona State University.

Right now, most colleges analyze too little data, Pittinsky says, and fail to address completion and quality issues as a result. But he's sometimes troubled by what he hears at education-technology conferences. Last year, he challenged a fellow panelist: "I just sort of stopped and said, I think you're describing a state of education where every interaction a learner is having with a faculty member and with each other [online] is tracked and used to form judgments about them, to form judgments about people like them, to form judgments about the next group of people like them."

"There's something worth talking about in that," he says. Pittinsky worries that the back and forth of classroom inquiry might be stifled if faculty and students knew every keystroke they made, even every word they spoke, was being recorded and used to make predictions about them.

Predictive tools could convince educators and students that the academic future is predetermined, in the same way that placing students into "honors" or "regular" grade school classes can end up defining them.

"What begins as the notion of pacing education to each learner's abilities at the time can very quickly become a solidified view of what someone is able to do and what someone is not able to do, with very heavy-handed direction given to them about what they then have access to," Pittinsky says.

Universities should be able to navigate privacy and ethical issues: They are, after all, packed with people who conduct research and ponder big questions for a living. With well-trained advisers and well-designed tools, predictive analytics needn't pigeon-hole students into one major over another.

At the heart of the debate over predictive technology are two competing visions for a college education. Should college be a period when students can find their passion, make mistakes, and learn from them? Or does that approach doom some students—particularly underrepresented students—to failure?

"What we were doing in the past wasn't working," Renick says. Under Georgia State's former, less proactive advising system, students whose parents had gone to college were fine, but first-generation students floundered. "They left with high amounts of debt and no degree. That was not an acceptable program," Renick says.

(Image via Maksim Kabakou/ Shutterstock.com)
