ChatGPT and other generative AI technologies have been dominating adult conversations around education, yet far less attention has been paid to how children are reacting to and using the new technology.
Jason C. Yip, an associate professor at the University of Washington who specializes in learning sciences and child-computer interaction, is currently studying the ways children interact with ChatGPT and other generative AI in a safe environment at his research lab.
“We've been thinking a lot about ways that children conceptualize GPT language learning models: what do they think about it? What are the ways they like it or dislike it?” Yip says.
His research lab has also been asking children to work with other forms of generative AI that create music or art. And while the research is ongoing, so far children, much like adults, have reacted to the technology with a mixture of interest and skepticism.
Children and ChatGPT: Pros and Cons
Yip is working with children using the free version of ChatGPT, which is powered by GPT-3.5 and is more widely available than the subscription-only ChatGPT Plus, which offers access to GPT-4. Thus far, some children are impressed with the language model’s abilities. “They'll try to what we call 'break the machine,' and see what kind of questions they can get away with,” Yip says.
Other students are not impressed. “They're like, ‘Oh, yeah, it gets it wrong right here and right here,'” Yip says. “Some of them realize it can do certain things that are schoolish in some ways, but it can't really do things that they know very, very well.”
For example, Yip worked with one student who was a Pokemon expert. When that child asked ChatGPT what the most expensive Pokemon card was, the response was completely inaccurate.
“It’s really interesting, the fact that children find it both impressive that they can type these things in and the machine will output [an answer], but also the fact that children can also see the mistakes within it,” he says.
Using ChatGPT to Cheat and AI to Create
In education, much of the conversation around ChatGPT has focused on how students can use it to cheat. “Some will,” Yip says. “But other children are thinking, ‘No, it's not even good enough for Pokemon, how could it be good enough for school?’”
Yip and his colleagues also work with students on other AI platforms, including DALL·E 2, OpenAI’s AI art generator. “We had an activity where we had kids create stories, trying to see if they can match the story with AI-generated art, and the kids were like, ‘It's okay.’ The problem is that the two don't match together very well.”
Similarly, when students try to create AI-generated music, they have been underwhelmed with the output. This is because they don’t have the technical skill or musical or artistic knowledge to effectively prompt the AI tool, Yip says. “You actually have to have really strong technical prowess,” he says. “The people who can actually use GPT and create art usually have a lot more technical skill.”
Developing AI Skillsets
As AI tools become more common, Yip believes we need to pay attention to how children will use the tech to learn and create. “It's very similar to how if you have any kind of tool, like a paintbrush or a hammer. A kid can hit a nail down pretty well, but a professional can do it better,” Yip says. “The answer isn't there yet for me of how they use tools like the AI generative models and the language learning models, along with what they know, to maximize the outputs and maximize their skill sets.”
How students can best learn to interact with AI models and get the most out of the experience is certainly as important a question as how educators can prevent students from using AI to cheat. It's also one that many educators will be forced to confront in the coming months and years.