No single solution helps all students complete MOOCs
Date:
June 15, 2020
Source:
Cornell University
Summary:
In one of the largest educational field experiments ever conducted,
researchers found that promising interventions to help students
complete online courses were not effective on a massive scale --
suggesting that targeted solutions are needed to help students in
different circumstances or locations.
FULL STORY
==========================================================================
In one of the largest educational field experiments ever conducted, a
team co-led by a Cornell researcher found that promising interventions
to help students complete online courses were not effective on a massive
scale -- suggesting that targeted solutions are needed to help students
in different circumstances or locations.
Researchers tracked 250,000 students from nearly every country in 250
massive open online courses (MOOCs) over 2 1/2 years in the study,
"Scaling Up Behavioral Science Interventions in Online Education,"
published June 15 in the Proceedings of the National Academy of Sciences.
"Behavioral interventions are not a silver bullet," said Rene Kizilcec, assistant professor of information science and co-lead author.
"Earlier studies showed that short, light-touch interventions at the
beginning of a few select courses can increase persistence and completion rates," he said. "But when scaled up to over 250 different courses and
a quarter of a million students, the intervention effects were an order
of magnitude smaller." The study was co-led by Justin Reich of the Massachusetts Institute of Technology and Michael Yeomans of Imperial
College London. The research was conducted on the edX and Open edX
platforms, and edX has engaged in work to make the data available to institutional researchers to advance educational science at scale.
The 250 courses the researchers studied came from Harvard University,
MIT and Stanford University.
Failure to complete online courses is a well-known and long-standing
obstacle to virtual learning, particularly among disadvantaged communities
and in developing nations -- where online education can be a key path
to social advancement. The findings have added relevance with so much
education around the world taking place online during the COVID-19
pandemic.
"My advice to instructors is to understand and address the specific
challenges in their learning environment," Kizilcec said. "If students
have issues with their internet connection, you can't help them overcome
them with a self-regulation intervention. But if students need to go to
bed on time in order to be awake for a morning lecture, or they need to
plan ahead for when to start working on homework in order to have it ready
to hand in, then a brief self-regulation intervention can in fact help
students overcome these obstacles."
Previous, smaller-scale research, performed by Kizilcec and his
co-authors as well as other scholars, found that goal-setting
interventions such as writing out a list of intentions at the start of
the class improved students' course completion rates.
In this study, the researchers explored the effects of four interventions:
* plan-making, where students are prompted to develop detailed plans for
  when, where, and how they complete coursework;
* a related activity in which students reflect on the benefits and
  barriers of achieving their goal, and plan ahead about how to respond
  to challenges;
* social accountability, where they pick someone to hold them
  accountable for their progress in the course, and plan when and what
  to tell them; and
* value-relevance, where they write about how completing the course
  reflects and reinforces their most important values.
For the first three interventions, involving planning ahead, the
researchers found that the approach was effective in boosting engagement
for the first few weeks of the course, but the impact dwindled as the
course progressed. The value-relevance intervention was effective in
developing countries, where student outcomes were significantly worse,
but only in courses with a global achievement gap; in courses without
such a gap, it actually had a negative impact on students in developing
countries.
The researchers tested whether they could predict in which courses an
achievement gap would occur, in order to decide where the intervention
should be added, but found such gaps extremely difficult to predict.
"Not knowing if it will help or hurt students in a given course is a
big issue," Kizilcec said.
The researchers attempted to use machine learning to predict which
interventions might help which students, but found the algorithm was no
better than assigning the same intervention to all students.
"It calls into question the potential of AI to provide personalized
interventions to struggling students," Kizilcec said. "Approaches that
focus on understanding what works best in individual environments and
then tailoring interventions to those environments might be more
effective."
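To make that comparison concrete, here is a minimal simulation sketch;
it is not the study's code, and the synthetic data, the
logistic-regression outcome model and the assumed lift values are
illustrative choices only. When the true effect of each intervention is
the same for every student, a learned per-student assignment ends up no
better than giving everyone the single best intervention.

    # Illustrative simulation (assumed, not from the paper): compare a
    # model-based personalized assignment policy with a uniform one.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n, n_arms = 20_000, 4                  # students, candidate interventions

    X = rng.normal(size=(n, 5))            # synthetic student features
    arm = rng.integers(0, n_arms, size=n)  # randomly assigned intervention

    # Ground truth: a small completion lift that is the *same* for every
    # student, mirroring effects that barely vary across students.
    base = 1.0 / (1.0 + np.exp(-(0.3 * X[:, 0] - 1.5)))
    lift = np.array([0.00, 0.02, 0.02, 0.01])
    completed = rng.binomial(1, np.clip(base + lift[arm], 0.0, 1.0))

    # Fit an outcome model on features plus a one-hot of the assigned arm.
    A = np.eye(n_arms)[arm]
    model = LogisticRegression(max_iter=1000).fit(np.hstack([X, A]), completed)

    # Personalized policy: each student gets the arm with the highest
    # predicted completion probability.
    preds = np.stack(
        [model.predict_proba(
             np.hstack([X, np.tile(np.eye(n_arms)[a], (n, 1))]))[:, 1]
         for a in range(n_arms)],
        axis=1)
    personalized_arm = preds.argmax(axis=1)

    # Uniform policy: everyone gets the arm that looks best on average.
    uniform_arm = int(preds.mean(axis=0).argmax())

    # Evaluate both policies against the simulated ground truth.
    print("personalized:", (base + lift[personalized_arm]).mean())
    print("uniform:     ", (base + lift[uniform_arm]).mean())

Under these assumptions the two printed values come out essentially
equal, which is the pattern the study reports at scale.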
The researchers said their findings suggest that future studies should
be designed to consider and reveal the differences among students,
in addition to studies assessing overall effects.
The paper was co-authored by Christoph Dann of Carnegie Mellon
University, Emma Brunskill of Stanford University, Glenn Lopez and
Dustin Tingley of Harvard, Selen Turkay of the Queensland University of
Technology and Joseph Jay Williams of the University of Toronto. The
research was partly funded by the National Science Foundation, a
Stanford Interdisciplinary Graduate Fellowship and a Microsoft Faculty
Fellowship.
==========================================================================
Story Source: Materials provided by Cornell University. Original written
by Melanie Lefkowitz. Note: Content may be edited for style and length.
==========================================================================
Journal Reference:
 1. René F. Kizilcec, Justin Reich, Michael Yeomans, Christoph Dann,
    Emma Brunskill, Glenn Lopez, Selen Turkay, Joseph Jay Williams,
    Dustin Tingley. Scaling up behavioral science interventions in
    online education. Proceedings of the National Academy of Sciences,
    2020; 201921417 DOI: 10.1073/pnas.1921417117
==========================================================================
Link to news story:
https://www.sciencedaily.com/releases/2020/06/200615152116.htm