Student-Testing AI Proctorio Fails When Put to the Test

 

If I asked you to pay me thousands of dollars, hand over your private information, and let me monitor you through your computer’s webcam, all for nothing in return, would you accept? If you’re one of the thousands of US universities that buy remote proctoring software, you’ve effectively already said yes.

Virtual test administration has ballooned in recent years, a trend accelerated by the COVID-19 pandemic, which forced many kinds of exams online. One of the vendors cornering a large share of the online proctoring market is Proctorio. Proctorio was estimated to be in use at 63% of US and Canadian universities during the pandemic, and it has continued to proliferate even after students moved back to in-person classes. The software uses a variety of tools to track student behaviors like eye movement and body language, analyze those behaviors using AI, and flag anything “suspicious” happening during the testing session. This process, of course, involves a live video stream of the student taking the test, as well as giving the remote proctor a view of the room in which the test is being taken.

Given this invasive protocol, Proctorio has recently come under heavy fire for violations of student privacy and has been involved in a slew of lawsuits over its surveillance tech. You might think Proctorio is just a necessary evil: to monitor and catch cheaters at scale, when administering tests to thousands of students, Proctorio seems to offer an efficient solution. But this isn’t actually a typical cost-benefit tradeoff; the benefits of Proctorio don’t appear to exist. A recent study found that the sensitivity of Proctorio in actually catching cheating students (that is, the share of real cheaters it flags) should be “… put at very close to zero.” So not only does Proctorio invade student privacy, it also doesn’t seem to work the way the company claims.

How on earth can a company rake in millions in student tuition and taxpayer dollars by selling a product that, when empirically tested, doesn’t even work?

Software like Proctorio is adopted on the premise that catching cheating students can be automated. Universities have ensnared themselves in a familiar trap set by the tech industry’s constant evangelism of AI: they’re falling for the empty promise that surveillance and AI technology are a silver bullet for keeping student testing honest. By buying an automated method for monitoring students, universities can wash their hands of the meticulous work of enforcing the rules themselves. As scholar Sun-Ha Hong might put it, Proctorio sells a kind of “honeymoon objectivity,” promising a world of perfect, impartial student monitoring that is superior to human administration. Surveilling students this way is assumed to be less fallible and more precise because the advanced-seeming combination of a video feed and AI is doing the monitoring. This, of course, is a fantasy.

It’s the same pattern of AI and surveillance snake oil that so many fall for, like selling facial recognition AI as a solution to school shootings. Proctorio sells its product by stoking fears of student delinquency and marketing the supposed necessity of surveilling each student’s facial movements. That need for panoptic test administration is, of course, a problem Proctorio manufactured itself. And ironically, given how ineffective its software is at meaningfully preventing cheating, this “problem” is not even one that Proctorio can solve.

 