Many conversations about generative artificial intelligence in higher education center on academic integrity and threats to learning. Those issues are serious and challenging. But there is a generative side to generative AI, and some faculty, like Magy Seif El-Nasr, Professor and Department Chair of Computational Media, are helping both students and their fellow educators explore the new possibilities that generative AI offers.
In Fall 2023, Seif El-Nasr taught a new course on interactions between artificial intelligence systems and human lives: CMPM 80H (Human-Centered AI). The purpose of the course was manifold: to help students identify and understand the roles of the various AI systems at work in their lives; to provide them with the theory and vocabulary to discuss the impact, ethics, and bias that come with AI; and to give them time to develop a set of research skills that will allow them to further their understanding of AI.
Seif El-Nasr designed the course to adapt to student needs. At the beginning of the quarter, she said, “I surveyed the students’ use of AI, especially ChatGPT, etc., to see how they view it, perceive it, and use it.”
Each week, students engaged in hands-on activities with and about AI. Through this guided exploration, using a human-centered theoretical framing, they were able to learn the basic functionality of, and think critically about, a wide range of AI tools — from January AI, a predictive AI that uses nutrition data to help users manage their blood sugar, to Midjourney, the generative art system that produces images from user prompts. In each activity, students were able to leverage what they had learned from lectures and readings to better understand the technologies they examined.
Students also conducted original research on their fellow students’ use and perception of specific features of a smaller set of commonly used AI tools: Otter.ai’s capacity for accents, dialects, and languages other than English; ChatGPT’s potential for supporting users’ problem solving, critical thinking, and creativity; and the ways in which users’ expectations of Alexa have changed over time.
Seif El-Nasr expected students to treat AI (in general) not only as a subject of inquiry, but also as a tool for their own use. “I constructed assignments to allow them to use it but think about its use as a companion rather than a replacement of work,” she said. “For example, when they wrote arguments and essays, they could use GenAI as a support tool for writing. But I then emphasized iterative writing through constructive feedback and iteration, so they needed to explain the arguments made through their essays, and tweak the writing based on comments and feedback I gave them.”
At the same time, Seif El-Nasr was working on two National Science Foundation-funded projects focused on the use of generative AI tools to enhance learning. In one, she and her fellow researchers employed a constrained large language model to help learners develop skills in both qualitative and quantitative data collection and analysis; in the other, she used an educational game to represent learners’ problem-solving processes and present them to other learners such that they were encouraged to reflect on their own processes. “GenAI opens up many possibilities that can help students through their work,” Seif El-Nasr said. “We have been building tools on top of GPT-4 to allow us to use the power of GenAI in education, particularly in coaching and practicing.”
Seif El-Nasr and her colleagues have also begun using AI for continuous course improvement and faculty development in the Learning Engagement Teams (LETs) project. In the LETs model, students engage anonymously in reflective conversations about their learning with an AI agent designed with specific prompts based on the week’s lecture materials. Data from those reflections are analyzed to produce actionable recommendations for instructors. While instructors have long gathered feedback from their students to improve their practice, AI has the potential to make the process more frequent, more personalized, and less labor-intensive. Moreover, because the solicitation and analysis of feedback are mediated by an AI, Seif El-Nasr and her colleagues were able to anonymize the process, reducing bias compared with traditional approaches.
Taken together, Seif El-Nasr’s efforts show both the promise and the risk of generative AI for teaching, learning, and life. “I think students are not very critical of GenAI’s content generation,” she said. “They take it for granted that the information they receive through their interactions with Gemini or ChatGPT is the truth. However, there are a lot of issues with GenAI, especially around hallucinations. One challenge is how to allow students to use GenAI but at the same time be critical and reflective of their practice.”
Story originally published by the Teaching and Learning Center.