During the Jan. 27 Senate meeting, students discussed how Wellesley’s Accessibility and Disability Resources (ADR) office has started to use AI note-taking software instead of employing student note-takers. The software, Glean, records lectures and provides students with a transcript enhanced by an AI-generated outline for organization. The platform also allows students to integrate slides, images and their own notes into their study materials. Glean is becoming increasingly popular among colleges, with its website boasting Dartmouth, the University of Michigan and Trinity College Dublin among the 750 institutions using the tool.
At first glance, ADR’s decision appears to be a step in the right direction: Wellesley is joining the ranks of peer institutions in providing a resource to support student learning. However, the move stands to do more harm than good: it directly threatens student employment and raises a host of ethical issues that affect both professors and students. ADR should stop using AI software for note-taking accommodations and instead continue to provide peer note-takers.
First, ADR’s move to AI software directly affects student note-takers. There is already a striking shortage of on-campus jobs, with many students who have Work-Study as part of their financial aid package struggling to find a job to meet that expected contribution. This shortage is a combination of two issues: first, there are more students on Work-Study than there are on-campus jobs, and second, securing an on-campus job often requires a connection. The second issue especially impacts underclassmen and transfer students who are new to campus. The student note-taker job, for the most part, solved this second issue.
Generally, the professor announces to the class that a note-taker position is available, and it is filled on a first-come, first-served basis. When I was a first-year, working as a student note-taker was often the only way my friends and I could make money on campus. By the time I was a sophomore, I had amassed the connections to find better-paying opportunities with steadier hours, but my ECON 101 note-taking job kept me afloat during my first two semesters at Wellesley.
Of course, the student note-taker job was not perfect: a mere $100 at the end of the semester is a joke of a payment. However, it was at least an on-campus job. Phasing out the position demonstrates a lack of consideration from college decision-makers, who did not feel the need to consult students. It was not enough to underpay students for their labor; now one of the few consistently available on-campus jobs is being eliminated as well.
Second, Glean presents ethical issues in the classroom. As a matter of medical privacy, students do not have to reveal their note-taker accommodations to their professors. This poses a problem: professors and classmates might not know they are being recorded. Under Massachusetts law, all parties must give consent before an in-person conversation is recorded. Yet if classmates or a professor withhold consent, the student with accommodations cannot learn properly. Either way, classmates and professors are put in an uncomfortable position.
Wellesley’s smaller class sizes lend themselves to more discussion-based learning. If a student shares a personal story during class or asks a “silly” question, there is now a saved recording of that moment. A human note-taker would have the discretion to omit the personal details a student shares while preserving the overall takeaway of the discussion. AI software that transcribes the entire class cannot do this.
Beyond the lack of consent from those who may not want to be recorded by AI software, other issues arise. Some professors object to their intellectual property being fed into AI, explicitly stating in their syllabi that students are not to put problem-set questions into ChatGPT for this reason. What happens, then, when an entire lesson, indeed an entire course, is recorded by AI? Moreover, in a political climate that is increasingly hostile to academia, many professors are entirely opposed to their lectures being recorded. Finally, if a course does not permit the use of AI assistance, a student using AI to fulfill their note-taking accommodation could also run afoul of the Honor Code.
These ethical issues also put the student with the accommodation in a difficult spot. Aside from having to navigate these challenges with classmates and professors, the student could themselves be opposed to using AI. And while a student might be able to insist on a human note-taker, many would hesitate to challenge ADR.
We do have to acknowledge that AI will continue to play a greater role in our lives and can be a tool to help students, but it should not be our first choice. Perhaps the AI software could be offered when a student finds the peer note-taker unhelpful and the professor consents to the recording. The student note-taker position is not perfect, but it should not be replaced entirely by AI before steps are taken to improve the existing role and the consequences for both the classroom experience and student employment are carefully considered.
Contact the editor(s) responsible for this story: Caitlin Donovan