Education News

Schools spend millions on professional learning. How do we know it's working?

A new study of math teacher professional development urges caution around defining effectiveness

By Ross Brenneman

A new study questions the effectiveness of tools used to measure teacher professional development.

In many professional development studies, teachers are asked to evaluate the change in their own knowledge or practices (self-reported learning). Some studies also test that change with direct assessments. For this study, researchers examined the relationship between the two.

“Each year, millions of dollars and a great deal of time are spent on professional development, with the assumption that PD will lead to improvements in teaching and student learning,” said Yasemin Copur-Gencturk, an assistant professor of education at USC Rossier and the study’s principal investigator. “Yet not knowing what makes PD effective limits the efficient use of funds and teachers’ time.”

Another reason for the study, Copur-Gencturk said, is that many conversations around professional development may make claims about effectiveness that can’t really be supported.

Self-learning vs. assessment

Copur-Gencturk’s new study collected data from 545 teachers who participated in content-focused professional development programs. The programs were supplied by a professional development organization supported by a Mathematics and Science Partnership grant from the U.S. Department of Education.

Teachers completed an assessment before and after their programs; on average, they showed a moderate, statistically significant increase in their mathematical knowledge for teaching (MKT).

Participants also answered a survey provided by the researchers to self-report their learning. The survey asked teachers to rate how much they felt their understanding of mathematical concepts had deepened, how much their understanding of how children think about and learn mathematics had increased, and how much more attention they paid to children's thinking and learning when planning their mathematics lessons.

Comparing the two, however, revealed that the correlation between teachers' gain scores on the direct assessment and their self-reported gains was almost zero: teachers who reported that they had learned more did not show correspondingly larger gains on the direct assessment.
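The analysis described here is a simple gain-score comparison: each teacher's post-test score minus their pre-test score, set against their self-reported learning rating. The sketch below, using made-up numbers rather than the study's data or code, shows how such a correlation is typically computed.

```python
# A minimal sketch (not the study's actual analysis) of a gain-score correlation.
# All numbers below are hypothetical, for illustration only.
import numpy as np

# Hypothetical scores for a handful of teachers
pre_assessment = np.array([52.0, 61.0, 47.0, 70.0, 58.0])   # direct assessment, before PD
post_assessment = np.array([60.0, 63.0, 55.0, 71.0, 57.0])  # direct assessment, after PD
self_reported_gain = np.array([4.2, 3.1, 2.8, 4.8, 3.9])    # survey ratings, e.g., on a 1-5 scale

# Gain score: change on the direct assessment
assessment_gain = post_assessment - pre_assessment

# Pearson correlation between observed gains and self-reported gains
r = np.corrcoef(assessment_gain, self_reported_gain)[0, 1]
print(f"Correlation between assessment gains and self-reported gains: {r:.2f}")
```

A correlation near zero in such an analysis would mean the two measures move independently, which is the pattern the study reports.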

Such a result suggests that teachers' self-reports and direct assessments captured different underlying constructs; in other words, the assessments may not measure the same learning that teachers perceive in themselves. A program identified as effective based on teachers' self-reports, then, might not be considered effective if the outcome measure were a direct assessment of teachers' learning.

That’s a problem because many of the large-scale studies of PD effectiveness rely on teacher self-reports, suggesting teacher educators and researchers need to use caution when studying PD efficacy.

Better PD approaches

Those results weren't surprising to Copur-Gencturk, who has long studied how well assessments align with what they purportedly measure. But she was surprised in other ways: Teachers who felt they had learned more also reported more frequently using, in their own teaching, strategies similar to those used by the PD facilitators. Yet the direct assessment did not detect any corresponding differences in teachers' learning.

“The teachers felt they had learned more when they’d already been using strategies similar to those used in the PD programs in their teaching more frequently,” she said. “But the more frequent use of teaching strategies that were not used by the PD facilitators was linked to less learning, as measured by the direct assessment.”

Copur-Gencturk urges administrators not to see the study as a reason to be skeptical of professional development writ large, but rather to be methodical in selecting a good program. She urged administrators to find PD in teachers' area of need that is supported by empirical research showing it can improve teachers' knowledge and skills, as captured by valid, reliable, and meaningful measures.

Said Copur-Gencturk: “The devil is in the details.”

Read the study: “A Comparison of Perceived and Observed Learning From Professional Development: Relationships Among Self-Reports, Direct Assessments, and Teacher Characteristics” (Journal of Teacher Education)
