CU-built software uses big data to battle forgetting
Computer software similar to that used by online retailers to recommend products to a shopper can help students remember the content they’ve studied, according to a new University of Colorado Boulder study.
The software, created by computer scientists at CU-Boulder’s Institute for Cognitive Science, works by tapping a database of past student performance to suggest what material an individual student most needs to review.
For example, the software might know that a student who forgot one particular concept but remembered another three weeks after initially learning them is likely to need to review a third concept six weeks after it was taught. When a student who fits that profile uses the software, the computer can pull up the most useful review questions.
“If you have two students with similar study histories for specific material, and one student couldn’t recall the answer, it’s a reasonable predictor that the other student won’t be able to either, especially when you take into consideration the different abilities of the two students,” said CU-Boulder professor Michael Mozer, senior author of the study published in the journal Psychological Science.
The process of combing “big data” for performance clues is similar to strategies used by e-commerce sites, Mozer said.
“They know what you browsed and didn’t buy and what you browsed and bought,” Mozer said. “They measure your similarity to other people and use purchases of similar people to predict what you might want to buy. If you substitute ‘buying’ with ‘recalling,’ it’s the same thing.”
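The analogy can be made concrete with a short sketch. The Python example below is illustrative only: the data layout, the similarity measure and the function names are assumptions for demonstration, not the model the researchers built. It simply weights other students’ recall outcomes by how similar their study histories are to the current student’s.

```python
from typing import Dict, List

# Illustrative sketch of a similarity-weighted recall predictor, in the
# spirit of the recommender-system analogy above. Not the study's model.
StudyHistory = Dict[str, int]  # concept -> 1 if recalled, 0 if forgotten


def similarity(a: StudyHistory, b: StudyHistory) -> float:
    """Fraction of shared concepts on which two students' outcomes agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    agree = sum(1 for c in shared if a[c] == b[c])
    return agree / len(shared)


def predict_recall(student: StudyHistory,
                   others: List[StudyHistory],
                   concept: str) -> float:
    """Estimate the chance `student` will recall `concept`, weighting other
    students' outcomes on that concept by their similarity to `student`."""
    weights, outcomes = [], []
    for other in others:
        if concept in other:
            weights.append(similarity(student, other))
            outcomes.append(other[concept])
    if not weights or sum(weights) == 0:
        return 0.5  # no informative neighbors; fall back to chance
    return sum(w * o for w, o in zip(weights, outcomes)) / sum(weights)
```

A review scheduler built on such a predictor would quiz each student on the concepts with the lowest predicted recall.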
The program is rooted in theories that psychologists have developed about the nature of forgetting. Researchers know that knowledge — whether of facts, concepts or skills — slips away without review, and that spacing that review out over time is crucial to forming robust and durable memories.
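The benefit of spacing is often illustrated with a textbook-style forgetting curve in which each well-timed review slows subsequent decay. The sketch below uses that standard form with made-up parameters purely for illustration; it is not drawn from the study.

```python
import math

# Illustrative only: exponential forgetting whose decay slows with each
# spaced review. Parameters are invented for demonstration.

def retention(days_since_review: float, stability: float) -> float:
    """Probability of recall after a delay; larger stability = slower forgetting."""
    return math.exp(-days_since_review / stability)


def review(stability: float, boost: float = 2.0) -> float:
    """Each successful, spaced review makes the memory trace more durable."""
    return stability * boost


stability = 2.0  # days; a freshly learned item fades quickly
print(f"One week after learning, no review:   {retention(7, stability):.2f}")

stability = review(stability)  # reviewed once, a few days later
stability = review(stability)  # reviewed again, weeks later
print(f"One week after two spaced reviews:    {retention(7, stability):.2f}")
```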
Still, it’s uncommon for students to do the kind of extended review that favors long-term retention. Students typically review material that was presented only in the most recent unit or chapter—often in preparation for a quiz—without reviewing previous units or chapters at the same time.
This leads to rapid forgetting, even for the most motivated learners, Mozer said. For example, a recent study found that medical students forget roughly 25 to 35 percent of basic science knowledge after one year and more than 50 percent by the following year.
Over the last decade, Mozer has worked with University of California, San Diego, psychologist Harold Pashler, also a co-author of the new study, to create a computer model that could predict how spaced review affects memory. The new computer program described in the study is an effort to make practical use of that model.
Robert Lindsey, a CU-Boulder doctoral student collaborating with Mozer, built the personalized review program and then tested it in a middle school Spanish class.
For the study, Lindsey and Mozer divided the material students were learning into three groups. For material in a “massed” group, the students were drilled only on the current chapter. For material in a “generic-spaced” group, the students were drilled on the most recent two chapters. For material in a “personalized-spaced” group, the algorithm determined what material from the entire semester each student would benefit most from reviewing.
In a cumulative test taken a month after the semester’s end, personalized-spaced review boosted retention by 16.5 percent over massed study and by 10 percent over generic-spaced review.
In a follow-up experiment, Mozer and his colleagues compared their personalized review program to a program that randomly quizzes students on all units that have been covered so far. Preliminary results show that the personalized program also outperforms random reviews of all past material.
So far, the program has been tested only in foreign language classes, but Mozer believes it could help improve retention in a wide range of disciplines, including math skills.
It’s not necessary to have a prior database of student behavior to implement the personalized review program. Students can begin by using the program as a traditional review tool that asks random questions, and as students answer, the computer begins to search for patterns in the answers. “It doesn’t take long to get lots and lots of data,” Mozer said.
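One way such a tool might bootstrap itself is sketched below. The cutoff value and the `predict_recall` hook are hypothetical, used only to show the idea of starting with random questions and switching to personalized selection once enough responses have accumulated.

```python
import random

# Illustrative only: cold-start behavior for a review tool with no prior
# database. The threshold and model hook are assumptions, not the study's code.
MIN_RESPONSES = 50  # hypothetical number of answers before personalization


def next_question(concepts, history, predict_recall=None):
    """Pick the next concept to quiz.

    `history` is a list of (concept, recalled) pairs collected so far.
    `predict_recall(concept, history)` would come from a trained model.
    """
    if predict_recall is None or len(history) < MIN_RESPONSES:
        return random.choice(concepts)  # cold start: plain random review
    # Once patterns emerge, quiz the concept most likely to have been forgotten.
    return min(concepts, key=lambda c: predict_recall(c, history))
```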
The research was funded by the National Science Foundation and the McDonnell Foundation.