
Evidence to practice: Beyond an effect size

Ultimately, the value of the evidence lies in how we make sense of objective data in a range of contexts.

In this blog, we take a closer look at the use of 'evidence' and what it means in practice, particularly unpicking the meta-analyses within the Teaching & Learning Toolkit (the Toolkit) and how school leaders and teachers can use the research.


The value of evidence beyond a summary effect size

A recent article in the McGill Journal of Education by Pierre-Jérôme Bergeron, an Assistant Professor in the Department of Mathematics and Statistics at the University of Ottawa, offers further insight into how meta-analyses work and provides food for thought on the Toolkit's approach.

Bergeron argues that we must be explicit about the risks of combining studies of similar outcome measures when drawing conclusions about the relative effectiveness of an intervention.

We agree that there are legitimate concerns about combining the results of different studies into a single effect size, and that there is a risk of ignoring important differences across studies. To address this challenge, the Toolkit's padlock rating system aims to make transparent how studies are included, providing a summary of the strength of the evidence, including the quality and quantity of the studies behind any given Toolkit approach.

For example, the approach of collaborative learning, which shows an average of five months' learning progress (as calculated from a mean weighted effect size), has high evidence security, represented by four padlocks.
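To make the phrase 'mean weighted effect size' concrete, here is a minimal sketch of how effect sizes from several studies can be combined into a single weighted average. It assumes standard inverse-variance weighting, as in a fixed-effect meta-analysis; the Toolkit's exact weighting scheme may differ, and all the numbers below are purely illustrative.

```python
# A minimal sketch, not the Toolkit's published method, of combining
# per-study effect sizes into a mean weighted effect size.
# Assumes inverse-variance weighting (fixed-effect meta-analysis);
# every number here is hypothetical.

effect_sizes = [0.45, 0.38, 0.52]   # illustrative per-study effect sizes
variances = [0.010, 0.020, 0.015]   # illustrative sampling variances

weights = [1.0 / v for v in variances]  # more precise studies get more weight
weighted_mean = sum(w * d for w, d in zip(weights, effect_sizes)) / sum(weights)

print(f"Mean weighted effect size: {weighted_mean:.2f}")
```

Under this weighting, a large, precise study pulls the summary figure towards its result far more than a small, noisy one, which is exactly why a single headline number can mask important differences between the underlying studies.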

Evidence strength

A four-padlock rating means that three or more meta-analyses of well-controlled experiments underpin the finding.

By contrast, the global studies on the physical environment have a one-padlock rating, with an average of zero months of learning progress. This indicates that only non-systematic reviews with quantitative data have been undertaken.

In the context of helping teachers use evidence in practice, this allows us to move beyond the simplistic 'Does it work or not?' to the far more sophisticated 'How well does it work in a range of settings?'

Situating evidence in complex contexts

Context is important! As we've discussed in previous blogs, the Toolkit can only offer high-level conclusions about 'what has worked'. The effect size of each Toolkit strand presents a generalisable finding that teachers and educators can use to make informed decisions for their students. The impact findings are not a 'holy grail' for what works in education: treating them that way would overlook important contextual variables and reduce a complex ecosystem like a classroom to figures and digits. Hence, situating the evidence in context is paramount.

How teachers use evidence is crucial. To help schools make such contextualisations, the Evidence for Learning team worked with the Melbourne Graduate School of Education at the University of Melbourne to put together Australasian research and evidence summaries for each of the 34 teaching and learning approaches in the Toolkit. This information can help school leaders and teachers interpret the effectiveness of an intervention, e.g. collaborative learning, in the Australian and New Zealand context.

As Sir Kevan Collins, CEO of the Education Endowment Foundation, puts it, teachers and school leaders play a key part in this shift in the culture of evidence, and in so doing build the profession:

In following the practice of evidence-based medicine, where clinicians seek out the best evidence to treat their patients, we hope the Toolkit empowers teachers to use it to inform their practice and to generate better evidence, situated in the contexts in which they teach. The evidence presented in the Toolkit should not be seen as a modus operandi ('what works').

For instance, a school leadership team might consider what the evidence means in their context: 'Effective for what purpose? For whom, and for what goal?'

Such questions go beyond the effect size of any single approach.

Dr Pauline Ho is an Associate Director at Evidence for Learning. In this role, she manages the Learning Impact Fund, a new fund building rigorous evidence about Australian educational programs.

Dr Tanya Vaughan is an Associate Director at Evidence for Learning. She is responsible for the product development, community leadership and strategy of the Toolkit.