When COVID-19 shut down in-person education a year ago, college and university staff worried that students would cheat on their remote exams. Many required students to download and install proctoring software that “…is effectively indistinguishable from spyware,” as the Electronic Frontier Foundation (EFF) warned. These technologies surveil students through their webcams, microphones, keystrokes, browsing history, and log files. The resulting data is fed into an AI, which flags “suspicious” behaviors.
To understand what this is like for students, I spoke to members of a student advisory board (SAB) associated with Rosalyn.ai, a remote proctoring solution (full disclosure: I’m an investor in Rosalyn.ai). This SAB, the first of its kind in the remote proctoring space, offers a student perspective on privacy, inclusion, equity, and other important issues.
We have learned from students that the proctoring AIs frequently mistake their innocent actions for cheating. Even worse, these facial recognition technologies disproportionately flag students of color, students with disabilities, and those who wear religious garb. Moreover, students without access to high-bandwidth internet — often living in under-resourced communities — have found that proctoring solutions crash on them and thereby interfere with their tests or lead to accusations of cheating.
Luz Elena Anaya Chong, an SAB member who studies international business at Texas State University, critiqued the fear-based approach inherent to most remote proctoring solutions. “If you look away, you’ll get flagged. If you do this, you’ll get flagged,” says Elena. She described the experience as “nerve-wracking.”
Imagine taking a two-day, 12-hour bar exam with a remote proctor. You were already nervous, but now your webcam is recording and relaying your every move, noise, and keystroke to an AI that tries to spot cheating. You’re not sure what will trigger it. Looking at the ceiling in deep thought might do it. If you’re diabetic, checking your glucose monitor could raise suspicion. Standing up to use the restroom could raise an alert, which is why some law students have chosen to urinate on themselves during the bar exam rather than risk an accusation.
Students tell me that situations flagged by the AI proctor are often sent to a judicial board, which can take weeks if not months to see the case. The student may be forced to accept a failed grade in the meantime, hurting their efforts to land an internship, fellowship, job, or other opportunity. It is reminiscent of the Red Scare — once the AI proctor makes an accusation, the damage is already done, even if the student is innocent.
In a perfect world, faculty would give proctor-free exams protected by an honor code. In reality, if there’s no proctor — or the system is too lax — the students who don’t cheat fall behind and feel punished for doing the right thing. Jessica Ramses ENG22, a systems engineering student at the University of Pennsylvania, says students want to be in an environment where “they can do their best and not fear that others are getting ahead in unethical ways.”
Instead of listening to students’ concerns, many proctoring platforms have intimidated critics with legal threats and copyright takedown notices. No wonder students and faculty across the U.S. have boycotted these proctoring solutions. There seems to be no space for dialogue with the vendors.
Thus, this “Ed Scare” is at an impasse. Cheating happens, especially in remote learning, but students shouldn’t spend their exams in fear of triggering an AI proctor that has no conscience and no intrinsic concern for their education.
It’s About More Than Cheating
Proctoring vendors made little or no effort to consult students or work through edge cases before selling their technology into hundreds of colleges and universities reeling from COVID-19. Clearly, these solutions have failed to meet faculty and student needs, let alone respect their civil rights. Ultimately, the purpose of proctoring technology is not to stop elite college students from cheating. It is to expand access to education and testing while preserving the value and integrity of formal degrees. Remote proctoring is about giving students “the right to learn,” says SAB member Dylan Singh, an accounting student at the University of Southern California.
The right to learn varies by population and environment. SAB member Clara Brewer, for example, noted that proctoring technology is not equipped for hands-on, laboratory tests — the kind she does as a neurobiology, physiology, and behavior major at the University of California, Davis. It would be backwards to design STEM tests for the limited abilities of proctoring technologies rather than the other way around. Vendors have yet to meet the actual needs of faculty and students like Clara.
Dylan worries about the grade school children he tutors in Los Angeles near USC. Education and proctoring solutions built for big-budget universities are too dry and disengaging for young learners. Cheating isn’t the issue here; the challenge is making children feel invested in learning and testing outside a traditional classroom.
Clark Chung, who studies naval architecture and marine engineering at the University of Michigan, emphasized that testing environments change across borders. A South Korean student, connected to some of the world’s best internet infrastructure, is in a different situation than a student in Afghanistan working with a 2G connection and no webcam. Should Afghan students be denied a coding certificate just because they don’t have high-speed internet and a new computer?
Efforts to democratize digital education will falter until students from all places and backgrounds can test on an even playing field. If technologists get proctoring right, students worldwide could learn lucrative trades, earn marketable certifications, and transform their lives. What will that take?
Digital Proctoring, With Dignity
Vendors in this space have caused serious harm to their industry’s reputation and to the students who have been forced to go along with their missteps. To earn back students’ trust, vendors should adopt several practices:
1. Give students control over their personal data. Remote proctoring has a conundrum. On the one hand, students feel nervous about being video recorded and having that data stored in the cloud by a vendor. It feels invasive. On the other hand, vendors need that data to train their AI and improve the proctoring experience. AI can learn the difference between a water bottle and a cheat card, but not without training data from real exams.
A compromise may resemble the “right to be forgotten” clause in Europe’s General Data Protection Regulation (GDPR). Tell students exactly what data is collected, where it’s stored, how it’s used, and when (or if) it will be deleted. Let students delete or obtain their data whenever they choose.
2. Make proctoring less contingent on Wi-Fi and computing power. Students with spotty Wi-Fi or slower computers suffer most from proctoring technology. Often, these students live in rural areas or under-resourced urban neighborhoods and lack access to university-grade computing resources. Their exams are more likely to be disrupted and flagged for cheating.
If that happens in the U.S., how are these solutions supposed to improve access to learning in, say, Central Asia and Sub-Saharan Africa, where cell towers often provide the only internet connectivity? Proctoring solutions need the ability to run on any web-enabled device at speeds as low as 300 kbps — barely enough to download two text emails per second.
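The 300 kbps figure can be sanity-checked with quick back-of-the-envelope arithmetic. This sketch assumes an average plain-text email of about 18 KB including headers — an illustrative estimate, since real email sizes vary widely:

```python
# Back-of-the-envelope check: how many text emails per second
# can a 300 kbps connection download?
# ASSUMPTION: ~18 KB per plain-text email (illustrative, not a spec).

LINK_SPEED_KBPS = 300      # link speed in kilobits per second
EMAIL_SIZE_KB = 18         # assumed average email size in kilobytes

# Convert kilobits/s to bytes/s (8 bits per byte).
bytes_per_second = LINK_SPEED_KBPS * 1000 / 8

# Divide throughput by the size of one email.
emails_per_second = bytes_per_second / (EMAIL_SIZE_KB * 1000)

print(f"{bytes_per_second / 1000:.1f} KB/s ≈ {emails_per_second:.1f} emails/s")
# → 37.5 KB/s ≈ 2.1 emails/s
```

At roughly 37.5 KB/s, the connection handles about two modest text emails per second — consistent with the claim above, and a reminder of how little headroom remains for streaming webcam video.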
3. Keep humans in the loop. Live proctoring isn’t scalable. Proctors aren’t paid particularly well and after several hours, they grow fatigued. AI doesn’t suffer those disadvantages, but it doesn’t understand context. There are many reasons why a student might gaze away, glance at the ceiling, or move their cell phone.
Rather than assume students are guilty, vendors should bring these instances to the attention of a human in the loop without disrupting the student’s exam. The human proctor can dismiss innocent behavior and mark instances where an AI has behaved in discriminatory ways. After the exam, the software can ask students to comment on any unusual event while it’s fresh in their memory. The proctoring AI should start a conversation, not a trial.
Sooner or later, the “Ed Scare” will end the way many scares do: through courage, empathy, and innovation. The students have spoken. Now it’s time for vendors to listen.
Julie Allegro Maples W90 WG04 is co-founder and managing director of FYRFLY Venture Partners, a seed stage venture firm investing at the intersection of “data + intelligence.” She is also the founder of the V Foundation Wine Celebration which has raised $118 million for cancer research since inception.