Teams from institutions across the country are invited to participate in the XPRIZE Digital Learning Challenge each year, applying their expertise to build or discover more effective learning technologies. In the 2023 competition, which ended this month, the winning team included members from Pittsburgh’s own Carnegie Mellon University.
The team’s Adaptive Experimentation Accelerator was created to let teachers run experiments in the classroom to determine which teaching strategies are most effective for their students. The tool won first place and a $1 million award.
The team also included members from the University of Toronto and North Carolina State University. From CMU, members included John Stamper, associate professor in the Human-Computer Interaction Institute, and Norman Bier, director of the Open Learning Initiative, with assistance from Steven Moore, Raphael Gachuhi, Tanvi Domadia, and Gene Hastings. According to them, a 20-person CMU team developed the tool over the course of two years.
Bier and Stamper have taken part in the competition before. Bier says the win builds on the university’s prior investments in the development of educational technologies. “This provides a demonstrable hypothesis-driven path towards development and iterative improvement, and I think it really distinguishes the way that CMU does learning science and ed-tech from a lot of other institutions,” he added.
Stamper added that the Adaptive Experimentation Accelerator was built in part with resources developed through participation in a prior challenge, in which competitors created software for a portable device used to teach arithmetic, English, and Swahili in Africa. Stamper said the CMU team ultimately succeeded in the most recent competition by building on those tools.
One of the new tool’s key features, according to the CMU researchers, is that teachers can configure it to automatically identify and default to more effective teaching strategies. For instance, if a certain percentage of students are not responding well to one message, they will be shown an alternative message instead.
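The adaptive behavior described above resembles a multi-armed bandit problem. As a rough illustration only — not the team’s actual implementation, and with hypothetical message names and response tallies — a Thompson-sampling sketch of that “shift toward what works” logic might look like:

```python
import random

def choose_message(stats):
    """Pick a message variant via Thompson sampling: sample a plausible
    success rate for each variant and assign the one that samples highest."""
    best, best_draw = None, -1.0
    for name, (successes, failures) in stats.items():
        # Beta(successes + 1, failures + 1) is the posterior over the
        # variant's success rate under a uniform prior.
        draw = random.betavariate(successes + 1, failures + 1)
        if draw > best_draw:
            best, best_draw = name, draw
    return best

# Hypothetical running tallies of (positive, negative) student responses.
stats = {"message_a": (4, 16), "message_b": (15, 5)}

# Over many assignments, the better-performing variant is chosen far more
# often, so the system gradually "defaults" to it while still occasionally
# exploring the weaker one.
counts = {"message_a": 0, "message_b": 0}
for _ in range(1000):
    counts[choose_message(stats)] += 1
```

The key design point is that assignment stays probabilistic: underperforming variants are sent less often rather than cut off outright, so evidence keeps accumulating for every option.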
Tools like these are important, Bier said, in part because education rarely offers the opportunity to determine which teaching strategies are most effective for students. He credited the Open Learning Initiative, which aims to enhance learning regardless of the format being used, with providing the resources that went into the winning entry.
“We need to be able to run these and replicate them at scale, and it has to be accessible,” Bier said, “if we’re going to really make progress in learning sciences.” The effort’s success, he said, “was largely dependent on experimentation.” Now that the 2023 competition is over, the Adaptive Experimentation Accelerator will be used in coursework at CMU and by its partners. The two researchers expect this to enhance learning for both teachers and students everywhere.
“The tools that were there, that we’re building out now, really opened the doors for every educator anywhere to participate in this kind of experimentation and join in helping us to better understand and improve human learning,” said Stamper.