Unlike virtually every other consumer-facing chatbot on the market, McGraw Hill’s AI Reader bot doesn’t come with a cutesy name and a bubbly persona. Ditto the company’s new writing assistant, Writing Assistant.
That’s very much by design, according to Chief Data Science and AI Officer Dylan Arena. The textbook publisher-cum-education tech platform wants to avoid personifying the AI it offers its impressionable audience of students.
“There are tremendous downsides to encouraging users, especially young users, to develop what are essentially parasocial relationships with artificial people,” Arena said. “People at first thought that I was a little bit of a Chicken Little on that front. But unfortunately, there have been all kinds of horrifying stories.”
Red lines like that are important when you’re outfitting a 137-year-old household name with a nascent, error-prone technology, especially one that caters to students who need to trust it for schoolwork. It’s a balance that a lot of schools and teachers are trying to navigate at the moment.
“We don’t have brand permission to, like, throw spaghetti at kids, which a lot of startups are doing,” Arena said. “They’re like, ‘Hey, maybe this will work, and maybe it won’t work, but either way, I’m taking a shot.’”
Educators are left to sift through plenty of that spaghetti. With students among generative AI’s most infamous and enthusiastic early adopters, many educators have come to terms with the fact that they can’t hold back the tide of AI, according to Victor Lee, an associate professor at Stanford University’s Graduate School of Education who studies AI and education.
“I wouldn’t quite say ‘acceptance,’ but it’s probably more like the acknowledgement stage,” Lee said. “But there is still a sense of uncertainty about how and when to use it effectively in schools.”
Zafer Unal, an education professor at the University of South Florida, has been researching teacher attitudes toward using AI to create classroom materials. He found that educators are concerned about data privacy and whether tools meet grade-level standards.
Unal started a free resource called TeacherServer, a platform built on a privately hosted open-source model that offers more than 1,000 lesson-planning tools for teachers. He said it currently has nearly 3 million registered users.
But TeacherServer’s tools are geared for back-end lesson planning, not for engaging students with AI directly. Unal said more research needs to be done there, but that students will inevitably use it, especially as future employers come to expect fluency in the tech.
“We need more experimentation, and we need to actually see the data. We need to have actual teachers using it and measuring it,” Unal said.
Many educators are still figuring it out as they go and developing individual rules on the fly, according to McGraw Hill’s Arena. Not much exists yet in the way of proven standards across the board.
“What I have not yet seen is any kind of coalescence of best practices at the practitioner level,” Arena said. “It’s starting to shake out. And there are groups like TeachAI that are coming up with guidance, documentation, and those kinds of things. But it really is bottoms up. It really is educators being thoughtful and creative and persistent and willing to fail [and] try things out.”
Arena said McGraw Hill only pushes out an AI tool after its “learning science and pedagogical experts are convinced that that will actually improve the learning experience meaningfully for teachers, for students, for administrators.”
Writing Assistant is designed to provide feedback similar to what a teacher roaming between desks and peeking over students’ shoulders might offer, Arena said. Maybe something like, “Good start, but try to find something specific from the text that supports your claim.”
AI Reader is supposed to help students break down complex passages or cue up pop quizzes as they navigate one of the company’s e-books. Arena said the company ran tests in which it introduced AI Reader partway through a course and found that students’ reading time increased afterward.
The goal is not to sideline or replace teachers, Arena said; a previous hype wave around free massive open online courses (MOOCs) in the early 2010s showed the folly of thinking that social instruction can be supplanted by technology.
Much of McGraw Hill’s development process involves curbing LLMs’ natural tendencies toward over-friendliness and engagement maximization, Arena said. While AI labs are now starting to come out with tools trained specifically for educational purposes, Arena said they could push even further into redesigning these models from the ground up.
“I would love for the foundation model providers to push further in that direction and even maybe disentangle the goal of the tool when it’s in learning mode from the underlying and broader goal of all of the LLM providers, which is to get people to use the LLMs more and more and more,” Arena said.