How To Build AI That Educators Want and Need

Join our conversation on how schools are adapting to AI, what it offers educators, and how AI products should be developed with those needs in mind.


By: Jana Thompson, Chief Product Officer, K–12 & Dylan Arena, Chief Data Science and AI Officer
Tags: Article, Corporate

Last month, we were excited to sit on a panel at EdTech Week in New York City to talk about our approach to “Building AI That Educators Want and Need.” In our discussion with Michael Nagler, Superintendent of Schools in Mineola, N.Y., we explored how schools are grappling with the evolution of artificial intelligence, what types of AI are useful to educators and school systems, and how products should be developed with these needs in mind.

This fall, McGraw Hill announced its first two generative AI tools designed to enhance its learning platforms and provide personalized learning experiences for students. AI Reader, rolling out in select eBook titles for back to school, enhances the eBook reading experience for college students; Writing Assistant, being tested in classrooms this fall, supports students as they practice and improve their writing. As we developed them, we used several core principles as guides.

McGraw Hill’s Chief Data Science and AI Officer Dylan Arena (left) and Chief Product Officer, K–12 Jana Thompson (right) presenting alongside the Superintendent of Schools from Mineola, N.Y., Michael Nagler (middle), at EdTech Week 2024.

Here are four things we believe are critical to building AI that is useful, effective and safe for our customers.

  1. Involve educators throughout the product development process.

    Listening to our customers is always our first step in innovation. We frequently bring together panels of educators to learn about the problems they face, so we can design solutions that make their lives easier. And as we pilot new products and functionalities, we gather feedback so we can make adjustments. This is true whether we’re building AI or not.

    As we were building one of our newest GenAI tools, AI Reader, instructors shared reactions to early concepts. One said, “What I love is that this is kind of like when I have a student come to my office hours, and I just explain something a different way until it clicks.” This was a totally new way of thinking about what we were building, and we leaned into it as we refined the design.

    Yes, technology is evolving, but we need to stay grounded in the problems we are trying to solve. We also hire lots of former educators, who help us understand the realities and challenges of the classroom!

  2. Let learning science be your guide.

    We shouldn't build AI simply for the sake of innovation. Instead, we use what we know about how the brain works and how learning happens to build our learning tools. The best solution might not be AI, but sometimes it is.

    One thing we know from learning science is that effective deliberate practice depends upon small bouts of focused effort on specific skills, repeated frequently, with immediate feedback. This is a concept we built into our new GenAI Writing Assistant tool.

  3. Be realistic about the capabilities of generative AI.

    LLMs (large language models) will hallucinate; it’s the nature of the technology. No tool built on LLMs can be guaranteed never to make things up. We need to build guardrails around the technology and give it specific instructions to minimize errors. In our AI Reader tool, we limit the prompt requests that students can make. Our goal is to build trust each step of the way, so people know they can come to McGraw Hill for help and support. It’s also important that educators and students (and all of us, really) learn more about how AI works so we understand what it’s good at and what it’s not good at.
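    To make the guardrail idea concrete, here is a minimal sketch of one common pattern: instead of sending free-form student input to an LLM, a tool can expose only a small set of vetted request types and fill in pre-written prompts. The names, request types, and prompt wording below are purely illustrative assumptions, not McGraw Hill’s actual implementation or API.

    ```python
    # Hypothetical guardrail sketch: the student chooses from an allowlist of
    # request types, and the tool builds the prompt from a vetted template.
    # Free-form requests never reach the model.

    ALLOWED_REQUESTS = {
        "summarize": "Summarize this passage in plain language: {passage}",
        "explain": "Explain this passage a different way for a student: {passage}",
        "quiz_me": "Write one practice question about this passage: {passage}",
    }

    def build_prompt(request_type: str, passage: str) -> str:
        """Return a vetted prompt, or raise if the request type is not allowed."""
        template = ALLOWED_REQUESTS.get(request_type)
        if template is None:
            raise ValueError(f"Unsupported request type: {request_type!r}")
        return template.format(passage=passage)
    ```

    Designs like this trade flexibility for predictability: the model still generates text, but only in response to prompts the product team has reviewed.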

  4. Don’t replace the human element of learning. Enhance it.

    We like the Office of Educational Technology’s metaphor that AI should be used like an e-bike rather than a robot vacuum cleaner. An e-bike helps us travel faster and farther, but humans remain in control. On the other hand, a robot vacuum cleaner can help keep floors clean, but it frequently gets stuck, can’t reach the corners of our living room, and sometimes falls down the stairs. Learning is a fundamentally social activity, and effective education environments will always include a capable, caring instructor and a curious student. AI should enhance that human connection rather than replace it.

    Ultimately, our approach to developing AI learning tools should not be much different from developing any other kind of educational technology. We want to solve problems for educators and help students master the skills they need to succeed academically and in life. AI presents us with many new possibilities to better serve educators and students—but we need to be sure we get it right.

To learn more about how we approach AI at McGraw Hill, click here: https://www.mheducation.com/our-ai-approach.html