Your chatbot is not your teacher: Choosing between two plausible paths for AI in education
Exploring the potential of using GenAI in education.
By: Dylan Arena
Chief Data Science and AI Officer, McGraw Hill
It may feel as though generative artificial intelligence (GenAI) has leapt from ChatGPT into every nook and cranny of our digital lives: the search results we see from Google, the productivity tools we use from Microsoft, and of course, customer-support chatbots all over the Web. As fascinating as GenAI is, though, the technology that has driven the most excitement (and anxiety) in recent years is still very much in its infancy.
As those of us in education think about how GenAI can be used responsibly and effectively to support learners and educators, we need to think carefully about the future we want to build. Educators already see a range of benefits for AI broadly, but 70% also have concerns about its unknown impacts, according to our recent Global Education Insights Report. The unknown impact of GenAI specifically is an area where I’m both excited and anxious.
To explain my ambivalence, I’ll describe two plausible paths for GenAI in education.
The path I worry about: Personified AI
Imagine a world where every student has an AI assistant acting like a human tutor “sitting beside them” on their devices as they do their homework and study. The assistant has a warm and friendly persona as it answers students’ questions, provides encouragement, tracks learning progress, suggests next steps, and more.
This may seem like a wonderful scenario, but I worry deeply about the human tendency to treat person-like interactions as equivalent to true human interactions. We are social creatures who readily anthropomorphize. If the AI tutor always listens, always cares, pays attention, asks us about ourselves, and laughs at all our jokes, young people—and even some older people—are likely to form emotional attachments to that AI tutor. (Adults have already formed intense emotional attachments to bots designed to act as virtual companions.)
But, of course, AI tutors are not humans, so those attachments will ultimately disappoint. If we build AI assistants that invite people (especially children) to become attached, those attachments could also make it harder (or less interesting) for people to do the hard work of building and maintaining attachments with actual people, like their peers or their teachers. And that would be a tragedy, because we know from extensive research that strong friendships protect against all sorts of developmental struggles and that strong teacher-student relationships are crucial for great learning outcomes.
Our society is grappling with various negative impacts of social-media participation on young people’s development and mental health. We should learn from those mistakes, and we should remember that education is a fundamentally social experience. Students need real interactions with classmates, and they need strong relationships with caring teachers who can guide them and help them develop as humans: teachers who can leverage their expertise in the long journey of learning, and who can model behavior like teamwork, perseverance, and responsibility.
The path of opportunity: AI as a human-enabler
The second plausible path for GenAI in education is one I’m much more hopeful about. It envisions a classroom with inquisitive students and passionate teachers—not terribly dissimilar to what you’d see in a classroom today, except that AI and other technologies are augmenting teachers’ capabilities and personalizing students’ experiences in new and powerful ways. In this classroom, human relationships are central, and AI is reducing the administrative burden on teachers to free up their time to do what they do best.
Imagine, for example, a writing tool embedded in digital learning materials that lets students ask for targeted guidance and feedback throughout a short-form writing process. Or imagine a tool embedded in e-reading experiences that lets students highlight specific text and ask for an alternative explanation, simpler language, or a quick quiz on the highlighted section. A teacher walking the classroom might have time to offer that kind of support to a few students, but a thoughtfully developed AI tool can do so for the entire class. AI tools like these would empower students by scaffolding their independent learning, and the tools could then provide insights to the teacher about student questions, confusions, or successes, allowing the teacher to follow up as appropriate.
As we build any technology for classrooms, we should support the creation and maintenance of meaningful relationships between teachers and students, and among classmates. We should build AI that can do something that teachers don’t have time to do, or AI that can help them do something better or faster. Our overall design goals should always include extending teachers’ capacity, reach, and positive impact on learners. For students, we should build AI that can give timely help while they’re working independently, especially in moments where we know they often get stuck or struggle to engage deeply enough to grasp new concepts.
We are only beginning to explore the potential for using GenAI in education. As we develop new uses for it, it’s important that we keep focused on proven learning science and what we know will help students learn and grow. No doubt AI is a tool that can help, but it’s up to us to use it responsibly.
Read more from Dylan Arena in his post about how AI is helping students learn today: https://www.mheducation.com/news-insights/blog/three-ways-ai-helps-students-learn-more-effectively-today.html.