The University of Arizona Tracked Students’ ID Card Swipes to Predict Who Would Drop Out

At the University of Arizona, researchers tracked freshman students’ ID card swipes to predict which students were most likely to drop out. The university sees this surveillance of student behavior as a way to lower its dropout rate.

The researchers collected data through CatCard student IDs, which are given to every enrolled student at the University of Arizona and can be used at hundreds of locations, including residence halls, labs, and the Student Union Memorial Center, which includes a hair salon, movie theater, convenience store, post office, and more, according to a university press release.

“It’s kind of like a sensor that’s embedded in them, which can be used for tracking them,” said Sudha Ram, a professor of management information systems at the University of Arizona. “It’s really not designed to track their social interactions, but you can, because you have a timestamp and location information.”
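To make Ram’s point concrete: a swipe record is essentially a (student, location, timestamp) tuple, and two students who repeatedly swipe at the same place within a few minutes of each other are plausibly together. The sketch below is purely illustrative — the student IDs, locations, and time window are hypothetical, and this is not the university’s actual data or method:

```python
from collections import defaultdict

# Hypothetical swipe log: (student_id, location, minute-of-day).
# Illustrative only -- not UA's real CatCard data or analysis.
swipes = [
    ("A", "student_union", 720),
    ("B", "student_union", 723),
    ("A", "library", 900),
    ("C", "rec_center", 901),
    ("B", "library", 904),
]

def co_locations(swipes, window=10):
    """Pairs of students who swiped the same place within `window` minutes."""
    by_place = defaultdict(list)
    for student, place, t in swipes:
        by_place[place].append((student, t))
    pairs = set()
    for place, events in by_place.items():
        for s1, t1 in events:
            for s2, t2 in events:
                if s1 < s2 and abs(t1 - t2) <= window:
                    pairs.add((s1, s2, place))
    return pairs

# Students A and B show up together twice -- at the student union and
# again at the library -- suggesting a social tie.
print(co_locations(swipes))
```

Nothing here requires the card to be “designed” for tracking; the timestamp and location alone are enough, which is exactly Ram’s point.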

As part of a separate but related research effort, university researchers have already collected freshman data over a three-year span and analyzed students’ time at the school “based on some 800 other data points, including academic performance, financial aid, student activity in the UA’s course management system, and a number of other factors,” a UA spokesperson told Gizmodo in an email. The school also intends to provide advisers with an online dashboard where they can look at the student data in real time.

“As early as the first day of classes, even for freshmen, these predictive analytics are creating highly accurate indicators that inform what we do to support students in our programs and practice,” said UA’s assistant provost for institutional research, Angela Baldasare.

The university creates lists six times a year with the names of students identified as most likely to drop out, Baldasare said. These lists are then shared with colleges so that advisers can intervene. According to Baldasare, the predictions for which students are most likely to leave the university are accurate about 73 percent of the time from day one of classes.

“We think by doing these interventions by the 12th week, which is when students make up their mind, you’re sort of doing what Amazon does—delivering items you didn’t order but will be ordering in the future,” Ram said.

But there is an important distinction to make here. By choosing to purchase goods on Amazon, a customer is willingly and knowingly sharing data with the company. It’s not clear whether the same is true for University of Arizona freshmen. The CatCard policy site, for example, does not disclose that the university monitors student behavior through swipes and payments. We have reached out to Ram for comment on whether students are made aware that their CatCard data is being collected for research, and whether they have the option to opt out.

“We live in an era where you shouldn’t be generalizing about ‘groups of people,’” Ram said. “You should be personalizing solutions at the individual level.” According to the university press release, the data gathered is “merely a signal.” Ram added:

“It’s ultimately up to the advisers on the ground to use that information to diagnose the problem and help students as best they can, with the understanding that it never will be possible to retain everyone.”

One might argue that monitoring an entire student body lacks the nuance needed to deal with individuals on a personal level. And, as we’ve seen, algorithms aren’t free from bias, so there are real consequences to leaning heavily on predictive analytics to solve a social issue.

Correction: A previous version of this article incorrectly implied that lists of students likely to drop out were based on Sudha Ram’s CatCard data research. According to a UA spokesperson, these lists are instead created based on some 800 other data points. The lists are also produced six times each year, not four. We regret the errors.