KU researchers studying efficacy of automatic recording and AI analysis of children's language use and development
LAWRENCE — AI systems such as ChatGPT can accurately process adult speech in relatively quiet settings, but because they are trained largely on adult speech, they struggle with children’s speech. And when children are speaking in noisy environments such as science museums, schools and busy homes, accurately measuring verbal interactions between young children and adults becomes harder still.
Scientists at the Juniper Gardens Children’s Project, a KU Life Span Institute center in Kansas City, Kansas, have been testing how artificial intelligence (AI) can measure verbal interactions unobtrusively and automatically in a variety of settings: currently in area child care and early education programs, and previously at locations such as Science City at Union Station.
A $174,000 grant from the National Science Foundation (NSF) supports a project that began collecting data in area early education and child care settings this fall, expanding on prior NSF-funded research at KU. For both projects, KU researchers are collaborating with computer engineers at the University of Texas-Dallas to explore the use of digital recorders to capture verbal interactions between young children and adults, then process the recordings to identify teacher talk that includes words and phrases related to social-emotional learning.
Kathy Bigelow, associate research professor at Juniper Gardens, is leading the most recent grant, together with John Hansen, professor of electrical engineering at the University of Texas-Dallas, and co-investigator Dwight Irvin, a former Juniper Gardens researcher now at the University of Florida. The project, “Social-Emotional Analysis of the Language Environment (SEAL): Key Word & Phrase Spotting in Early Childhood Care Setting,” uses advanced speech-processing algorithms to automatically capture the words and phrases that teachers in early education settings use in everyday interactions to promote children’s social-emotional learning. The researchers are assessing whether automatically measuring teachers’ talk from audio recordings in toddler classrooms could be a transformative way to help teachers better support children’s social-emotional development, such as understanding, managing and expressing emotions.
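The SEAL pipeline itself has not been published in full, but the core idea of key word and phrase spotting can be sketched in a few lines of code. The sketch below is illustrative only: it assumes the audio has already been transcribed to text, and the phrase list is hypothetical rather than the project’s actual vocabulary.

```python
import re
from collections import Counter

# Hypothetical examples of social-emotional words and phrases a teacher
# might use; the actual SEAL vocabulary is not public.
SEAL_PHRASES = [
    "how are you feeling",
    "take a deep breath",
    "use your words",
    "frustrated",
    "proud",
    "kind",
]

def spot_phrases(transcript: str, phrases=SEAL_PHRASES) -> Counter:
    """Count occurrences of each target word or phrase in a transcript.

    Assumes speech has already been converted to text; production
    key-word spotting systems often operate on audio features directly.
    """
    text = transcript.lower()
    counts = Counter()
    for phrase in phrases:
        # Word-boundary match so "kind" does not match "kindergarten".
        counts[phrase] = len(re.findall(rf"\b{re.escape(phrase)}\b", text))
    return counts

if __name__ == "__main__":
    sample = ("I can see you're frustrated. Take a deep breath "
              "and use your words to tell me how you feel.")
    print(spot_phrases(sample))
```

Counts like these could then be aggregated per teacher or per classroom to produce the kind of feedback dashboard the researchers describe.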
Bigelow said that there is a need for tools that can provide data-based feedback to teachers on talk that focuses on social-emotional learning.
"Like a ‘Fitbit’ for counting steps, we envision SEAL data can be used by teachers, coaches and early childhood programs to monitor classroom social-emotional talk and build capacity to improve social-emotional competence in young children,” Bigelow said. "Such a tool could better ensure young children gain the skills to ready them for school, relationships and life."
Most existing methods for recording and analyzing speech in settings such as preschools and science museums have been slowed by the need for humans to evaluate who is speaking and to determine how language is being used or understood.
KU Research Professor Jay Buzhardt, associate director at Juniper Gardens, and Irvin recently concluded the prior $300,000 NSF grant, which studied the use of language environment analysis, or LENA. The LENA devices are small, child-safe recorders that children wear for a day and that work like a pedometer for speaking. Like Bigelow, Buzhardt and Irvin worked with the University of Texas-Dallas team, led by Hansen, to process the recordings with AI systems the team developed to analyze the audio. The method identifies speakers and key words and counts words and conversational turns.
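The exact algorithms Hansen’s team built are not described in the article, but the conversational-turn metric it mentions is straightforward to illustrate. A minimal sketch, assuming the audio has already been separated into labeled, time-stamped speaker segments by a diarization step; the speaker labels and the five-second threshold are assumptions, not the project’s actual parameters.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str   # e.g. "child" or "adult", from a diarization step
    start: float   # seconds
    end: float     # seconds

def count_turns(segments, max_gap=5.0):
    """Count adult-child conversational turns.

    A turn is counted whenever the speaker changes and the response
    begins within `max_gap` seconds, one common style of definition.
    """
    turns = 0
    ordered = sorted(segments, key=lambda s: s.start)
    for prev, cur in zip(ordered, ordered[1:]):
        speaker_changed = prev.speaker != cur.speaker
        close_in_time = cur.start - prev.end <= max_gap
        if speaker_changed and close_in_time:
            turns += 1
    return turns

if __name__ == "__main__":
    demo = [
        Segment("adult", 0.0, 2.1),
        Segment("child", 2.8, 4.0),    # responds within 5 s: one turn
        Segment("adult", 4.5, 6.0),    # another turn
        Segment("child", 30.0, 31.0),  # gap too long: not a turn
    ]
    print(count_turns(demo))  # prints 2
```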
“It’s work that would take hundreds of hours for human coders to complete,” Buzhardt said of ongoing studies exploring the method. “Using the AI system that Dr. Hansen’s team developed, we sought to use this as an unobtrusive method of measuring visitor engagement with exhibits at Science City and in classrooms and other environments.”
In one of several published findings from the research, the scientists demonstrated that streams of audio could be analyzed effectively while maintaining the security and privacy of children and adults, and they used the information to provide feedback for teachers in school settings.
At Science City, where thousands of students and families engage with 300 hands-on indoor and outdoor activities, museum educators and staff are limited in what they can learn about how visitors engage with exhibits. They sought better ways than surveys and direct observation to measure the interactions that happen as visitors talk with classmates, friends and family among the exhibits.
The KU research team, in turn, was interested not only in finding a way to passively measure how visitors engage with science museums by examining parent-child interactions, but also in using those interactions for research on children’s language development. The KU researchers worked with museum staff to test LENA for recording interactions and to analyze the speech through the AI application.
After the AI system was adjusted for use in a large public space like Science City, its accuracy in coding parent-child interactions such as conversational turns, child initiations and adult initiations ranged from 85% to 95%, which is at or above the expectations typically set for human observers in social science research, Buzhardt said. The researchers presented the study results earlier this year.
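The article does not spell out how that accuracy figure was computed, but in observational research agreement is often reported as simple percent agreement between two coders. A minimal sketch of that calculation, using made-up labels rather than study data; whether the study used exactly this formula is an assumption.

```python
def percent_agreement(codes_a, codes_b):
    """Percent agreement between two coders' label sequences.

    Observational studies sometimes add chance-corrected statistics
    such as Cohen's kappa; this shows only the simple version.
    """
    if len(codes_a) != len(codes_b):
        raise ValueError("code sequences must be the same length")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100.0 * matches / len(codes_a)

# Made-up example: AI codes vs. human codes for ten audio segments.
ai_codes    = ["turn", "turn", "child_init", "adult_init", "turn",
               "child_init", "turn", "adult_init", "turn", "turn"]
human_codes = ["turn", "turn", "child_init", "adult_init", "turn",
               "child_init", "turn", "turn", "turn", "turn"]
print(f"{percent_agreement(ai_codes, human_codes):.0f}% agreement")  # 90%
```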
Buzhardt noted that young children’s learning during and before preschool is primarily shaped by the number and variety of opportunities they experience in unstructured, or “informal,” learning settings and activities. About 80% of preschool children’s time outside of school involves such informal learning experiences.
He said the studies help researchers understand how parents or teachers and children interact with one another in informal contexts such as science museums, and they allow engagement to be evaluated more accurately and frequently. Additionally, affordable tools that can automatically code interactions in natural environments can give parents and educators immediate feedback about their language environment and where it can be enriched.
“Children’s success in school and in life depends on whether they have learned certain social and emotional skills in their early years,” Bigelow said. “We know this is a challenge in early childhood education and care settings.”