The positive effects of a great tutor are truly outsized, and many students benefit from Brown’s robust TA program. In recent years, however, demand for TAs (especially in computer science) has begun to outpace supply, resulting in long queues at office hours, overworked TAs, and frustrated students.
We believe large language models (LLMs) have tremendous potential to alleviate some of that demand.
Yet, although many students already use LLMs for help with conceptual problems and homework, LLMs are not ideal tutors out of the box. They give answers when they should guide, and their explanations are not grounded in course materials, which often confuses students further.
This is why we’re building ATA, an LLM-powered tutor that has access to all the course materials a TA would have and is structured to guide instead of give.
Ingestion refers to chunking and embedding relevant course materials. For cs15, those materials include lecture slides, homework assignments, the course syllabus, and Q/As from current and previous iterations of the course (sourced from EdStem, a popular forum platform for college courses).
Ingestion is key: it fills in the gaps in an LLM's foundational knowledge that it needs to provide an insightful, course-specific response.
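To make this concrete, here is a minimal sketch of the ingestion step: split exported course files into overlapping chunks and embed each one. The directory layout, chunk sizes, and embedding model (sentence-transformers here) are illustrative assumptions, not ATA's actual pipeline.

```python
# Minimal ingestion sketch: chunk plain-text course materials and embed them.
# Assumes materials (slides, handouts, EdStem Q/As) have already been exported
# to .txt files; chunk sizes, the model, and the file layout are illustrative.
from pathlib import Path

from sentence_transformers import SentenceTransformer  # pip install sentence-transformers


def chunk_text(text: str, chunk_size: int = 800, overlap: int = 200) -> list[str]:
    """Split text into overlapping character windows so ideas aren't cut mid-chunk."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start : start + chunk_size])
        start += chunk_size - overlap
    return chunks


def ingest(course_dir: str, model_name: str = "all-MiniLM-L6-v2") -> list[dict]:
    """Chunk every .txt file under course_dir and attach an embedding to each chunk."""
    model = SentenceTransformer(model_name)
    records = []
    for path in Path(course_dir).rglob("*.txt"):
        for chunk in chunk_text(path.read_text(encoding="utf-8")):
            records.append({"source": path.name, "text": chunk})
    embeddings = model.encode([r["text"] for r in records])
    for record, embedding in zip(records, embeddings):
        record["embedding"] = embedding
    return records


if __name__ == "__main__":
    index = ingest("materials/cs15")  # hypothetical directory of exported course files
    print(f"Ingested {len(index)} chunks")
```

In practice the embeddings would go into a vector store rather than an in-memory list, but the shape of the step, chunk then embed, is the same.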
<aside> 💡 As of now we are able to ingest a course automatically provided the URL to the course website. This has been tested on MIT OpenCourseWare and select courses @ Brown.
</aside>
The tutor engine is the core logic of ATA: a multi-agent architecture that, given a student's query, responds in a pedagogical manner.
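As a rough illustration of the guide-instead-of-give idea (not ATA's actual engine, which is outlined below), the sketch pairs a stubbed retrieval step with a tutoring system prompt. The model name, prompt wording, and `retrieve_context` stub are all assumptions.

```python
# Sketch of a grounded, guiding response: retrieve course context (stubbed),
# then answer under a tutoring prompt. Requires OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

TUTOR_PROMPT = """You are a teaching assistant for cs15.
Ground every reply in the provided course excerpts.
Guide the student with questions and hints; never hand over the full solution."""


def retrieve_context(query: str) -> str:
    """Placeholder for retrieval: would embed the query and return the nearest
    chunks from the ingestion index built earlier."""
    return "[top-k course excerpts relevant to the query]"


def tutor_reply(query: str) -> str:
    """Combine retrieved context with the student's question and ask the tutor model."""
    context = retrieve_context(query)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": TUTOR_PROMPT},
            {"role": "user", "content": f"Course excerpts:\n{context}\n\nStudent question: {query}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(tutor_reply("Why does my linked list segfault when I delete the head?"))
```

Splitting retrieval, pedagogy checks, and response generation across separate agents is one natural way to grow this single-call sketch into a multi-agent setup; the grounding context and the system prompt do most of the "guide, don't give" work.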
<aside> 💡 The tutor engine is in development. The following is a high-level picture of its current state.
</aside>