The Hechinger Report
PROOF POINTS: Free, no-frills programs lead the class in new federal study of remote learning
U.S. Education Department found only three technologies met stringent criteria for being effective
By Jill Barshay
March 1, 2021
When the coronavirus pandemic first hit in March 2020, the research
unit inside the U.S. Department of Education, called the Institute of
Education Sciences, commissioned a report to wade through all the
studies on education technology that can be used at home in order to
find which ones were proven to work. The goal was to provide a quick
guide for teachers and school leaders during remote instruction.
Almost a year later, in February 2021, the results are in: a mere three
online learning technologies have clear evidence for improving student
achievement. One helps middle schoolers with math homework. Another
improves reading comprehension among older elementary school students.
The third is an online algebra course for eighth graders.
All three were developed at universities and have no flashy graphics,
animations or games. None uses especially sophisticated algorithms to
tailor the instruction to each student, known as “adaptive” learning,
but they do give instant feedback, letting students know what they’ve
gotten right and wrong.
“We don’t have flying frogs and leaping lizards but we do try to teach
kids how to understand what they read,” said Kausalai (Kay) Wijekumar,
a professor at Texas A&M University who developed the reading
comprehension program ITSS that made the list of three. “It’s
systematic direct instruction, using evidence-based practices. No glitz
or gimmicks.”
This doesn’t mean that everything else is bad or ineffective, but it
does mean that very little education technology has been rigorously
tested in the way that pharmaceuticals or vaccines are. The Department
of Education insisted upon high research standards for this report, “A
What Works Clearinghouse Rapid Evidence Review of Distance Learning
Programs.” Classrooms or individual students had to be randomly
selected to use the education software, with researchers comparing
those students’ test results or grades with students who didn’t use it.
No small studies were allowed. Programs had to be tested on at least
350 students and in more than one school.
These tough requirements whittled down an initial list of more than 900 studies to just three remote learning programs.
“Things like Google Classroom, or Zoom or whatever your child’s
teachers are using, it doesn’t mean that they’re not effective,” said
Sarah Sahni, lead author of the report and a policy researcher at the
nonprofit research organization American Institutes for Research,
which was hired to conduct the analysis. “It means that there is no
published study that uses a randomized design showing that it’s
effective. We haven’t done the study yet to understand. And that’s
really one of the big findings that we have here, that there isn’t
enough rigorous research on these kinds of programs.”
Because the government wanted to focus on what works, Sahni only
analyzed studies with positive results for students. This analysis
didn’t point fingers at which education software hasn’t been properly
evaluated or which education technology has been proved in studies to
be ineffective or harmful.
Sahni and her colleagues didn’t test any of the ed tech themselves or
conduct any experiments on students, but tracked down studies that were
published in the past 10 years. All of the studies took place before
the pandemic. Much of the educational software was actually used inside
school buildings, not only at home. But if it could easily be used
remotely, the researchers considered it. Researchers focused on
learning programs that students could access through a computer, not
learning games on a smartphone app.
Of the three programs that rose to the top, one of them — an online
algebra course — doesn’t exist anymore. It was developed at the
University of Nebraska and the company that marketed it, Class.com, was
acquired by Cambium Learning, which mothballed it. It had been
effective at teaching algebra to eighth graders who were strong math
students but attended middle schools that didn’t offer algebra to
eighth graders, according to a 2011 study. (Another company, Class
Technologies Inc., uses the class.com domain, but isn’t selling this
algebra course.)
The other two do exist. ASSISTments is an online homework site for
middle school math, currently used by more than 500,000 students.
Teachers assign students practice questions on the free website,
developed by Neil Heffernan, a professor at Worcester Polytechnic
Institute in Massachusetts. In addition to immediate feedback, like an
answer key, there are optional drills and hints to show students how to
solve problems step by step. A large study in Maine found strong
results for students who spent only 10 minutes a night three or four
times a week on the software.
Teachers don’t have to change their existing lesson plans or textbooks
to incorporate it. Other education software, by contrast, often imposes
its own curriculum or requires teachers to make major changes to the
way they teach.
The reading program ITSS stands for Intelligent Tutoring using the
Structure Strategy and is used by about 150,000 students, mostly in
grades three through five. It encourages students to make sense of
nonfiction passages by first identifying how the writer structured the
piece and then summarizing the main point. Before the pandemic, it was
typically used during English class once or twice a week for a total of
30 to 60 minutes. Fourth and fifth graders demonstrated improved
reading comprehension from spending time on the software every week in
two separate studies. The software is free, but teachers and principals
are required to attend a few days of training and coaching, which costs
a school thousands of dollars.
I’ve heard a lot of grousing by ed tech and curriculum developers that
the Department of Education’s standards for what works are impossibly
high. Some jokingly call the Department’s What Works Clearinghouse the
“What Doesn’t Work Clearinghouse.” But I think there is value in
getting unvarnished evidence reviews from the government to help
teachers and school leaders put the hyperbolic claims of ed tech
marketers into context.
However, the department might reconsider its own hyperbolic terms. This
“rapid” review of the research to help schools during the pandemic took
more than nine months to produce, arguably too late for educators to
make much use of it.
Lead researcher Sahni told me that her team rushed the review out in
less than a third of the usual three-year time frame. That might
not have been fast enough to save students who slipped behind this
pandemic year. But it could help guide educators when they turn to
remote learning in the future.
Read this and other stories at The Hechinger Report