The Hechinger Report
PROOF POINTS: A crowdsourcing approach to homework help
An early experiment in asking teachers to write helpful hints shows promise
by Jill Barshay
August 3, 2020
Kids hate doing homework. Parents hate nagging about it. Teachers hate
grading it. There are even ongoing debates among educators about
whether homework assignments help students learn much at all. Here's one
way that homework might be more effective: crowdsourcing help from
teachers.
Neil Heffernan, a professor at Worcester Polytechnic Institute in
Massachusetts, came up with the idea in 2016 after a Maine middle
school math teacher, Chris LeSiege, uploaded hundreds of hints he had
written for solving textbook problems to Heffernan’s free online
homework help website, ASSISTments. When LeSiege’s 7th grade students
felt stuck at home, they could click on a help button for each problem
and get a tip or a reminder of something LeSiege had discussed in class.
LeSiege believed his prewritten hints and explanations were helping his
students and Heffernan wondered if all students might benefit from
having a virtual tutor hover over their shoulders during homework time.
Since not every teacher has the time or inclination to come up with
hints for every homework problem, Heffernan was curious whether hints
written by one teacher, with a particular way of explaining things and
a unique style of instruction, might still help students who are taught
differently by another teacher.
So Heffernan and a graduate student decided to create a system that
could crowdsource teacher hints and then conducted an experiment to see
if these hints were useful.
To test them, they set up a battle between hints and answers for each
student. When students clicked on a help button, they sometimes
received a teacher’s hint, which could be a pop-up text explanation or
a video of how to solve the problem. (See examples in the accompanying
illustration.) But other times, the help button simply revealed the
answer, like when you look up a problem in an answer key in the back of
a textbook.
In early trials in which 11,000 students completed more than 370,000
online homework problems, the teacher hints seemed to be more
effective. Students who had access to teachers’ hints answered the next
problem correctly on their own without any support 58 percent of the
time. In comparison, students who did not have access to the hints got
the next problem correct 54 percent of the time. The difference wasn’t
a big one but it was statistically significant.
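For readers curious how a four-point gap can still clear the bar for
statistical significance, here is a minimal, illustrative sketch of a
two-proportion z-test in Python. The counts below are hypothetical
(the article reports only the percentages and the overall scale of the
trial), so this shows the arithmetic of such a test, not the paper's
actual analysis.

    import math

    def two_proportion_ztest(hits_a, n_a, hits_b, n_b):
        # Compare the correct-answer rates of two groups under the null
        # hypothesis that both share the same underlying success rate.
        p_a, p_b = hits_a / n_a, hits_b / n_b
        p_pool = (hits_a + hits_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        # Two-sided p-value from the standard normal distribution.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical counts: 58 percent correct with hints, 54 percent
    # with answers only, over 20,000 next-problem attempts per group.
    z, p = two_proportion_ztest(11_600, 20_000, 10_800, 20_000)
    print(f"z = {z:.1f}, p = {p:.1e}")  # large samples make a small gap significant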
“Teachers [can] stop wasting their time writing and rewriting
explanations for the Pythagorean theorem,” said Heffernan. “Instead
[they can] spend their time motivating and challenging their kids to
not fall off the wagon during these COVID times.”
Computerized motivational messages from a random teacher, Heffernan
suspects, wouldn’t be as effective. “You really need to hear your own
teacher say, ‘Hey you’re not working very hard, Jill. I see that you
haven’t done anything in three days’,” Heffernan explained to me.
The peer-reviewed study, “Effectiveness of Crowd-Sourcing On-Demand
Assistance from Teachers in Online Learning Platforms,” is slated to be
published in August 2020 in the forthcoming proceedings of the Learning
at Scale conference.
A crowdsourced system like this would depend on different teachers
wanting to assign the same homework problems to their students. It also
relies on a cadre of experienced teachers who can make astute guesses
as to what a typical struggling student might need to solve a problem.
Individual students might need another hint entirely. Perhaps one is
making a basic computational error. Another might be forgetting how
many degrees are in a circle or doesn’t even understand why that fact
might be important to solve a problem on calculating angles. Only a
human tutor, sitting next to a student during homework time, would be
able to figure out the proper hint that an individual student needs in
a particular moment.
To build the crowdsourced hint system, the researchers paid 13 teachers
to write hints. "We selected teachers who both wanted to create
feedback for their students and [whom] we trusted to create quality
content based on our knowledge of their past work," said Heffernan by email.
Even some unpaid teachers became interested in writing homework help.
Eight of them created more than 200 hints each for their own students.
Altogether, nearly 150 teachers — out of 5,000 who were using the free
homework site — created more than 40,000 hints for more than
25,000 problems during the three years of this experiment, from 2017 to
2019.
The faceoff between hints and answers was also interesting because
there’s a lot of disagreement among experts on how best to give
feedback to students. Some research has found that wordy explanations
can be confusing and do more harm than good. Other studies have found
that simply telling the student the right answer can be quite helpful.
Crafting a clever hint that prods a student to think and work out an
answer is an art. Too often teachers unintentionally give away answers
in the hints. Teachers in this experiment found that it was simply
easier and more straightforward to explain, step by step, how to solve
a problem.
The next step in this line of research is to see if this kind of
teacher assistance is actually helping students to complete more math
homework instead of feeling stuck, getting discouraged and giving up.
If it proves effective, Heffernan says that the idea of teacher
crowdsourcing could potentially be added to any instructional website,
not just the ASSISTments site that he created.
In the meantime, Heffernan is working fast to build a new homework
system that can be used in real time during coronavirus remote
instruction so that teachers can see exactly where students are in
their independent practice work.
I’ve often written about how ed tech hasn’t proved effective in
rigorous research tests. It’s interesting to see an ed tech developer,
whose work has previously succeeded in randomized controlled trials,
still looking for new ways to let teachers’ ideas rise to the top.