At Evidence for Learning we are focused on helping educators make decisions that are evidence-informed. Evidence-informed decisions combine your professional judgement with evidence. Also important to evidence-informed decision making is the gathering of practice-based evidence (Bryk, 2015). Practice-based evidence includes quantitative and qualitative evidence and answers the questions:
- Has there been an improvement in students’ learning? (Hattie, 2015)
- What are the active ingredients involved in the implementation of the approach that worked in my setting and how did they work? (Sharples, 2013).
Gathering these types of evidence from your classroom and school means that when you make a change, you will know whether it has improved students’ learning. If the change has led to an improvement, you can embed it in your school. If there is little or no improvement in outcomes, you can discard the change and try something else (Evidence for Learning, 2018b).
When you are looking to try something new to address a specific area of need in your school, you can turn to the Teaching & Learning Toolkit (the Toolkit) (Education Endowment Foundation, 2018). The Toolkit helps you combine your professional judgement with evidence. It synthesises international and Australian research and presents a wide range of educational approaches, summarised in terms of:
- The average impact on achievement (months’ impact)
- The strength of the evidence
- The cost
The Toolkit is provided free of charge and is regularly updated. It was updated in October 2018 with new research identified through an international literature review, a process that occurs every six months. Sometimes this results in changes to some of the Toolkit headline figures, such as the months’ worth of learning progress.
The ‘What should I consider?’ section provides the most practical advice about how to turn evidence into action in your classroom or school. An example of this section for metacognition and self-regulation contains the following:
- Which explicit strategies can you teach your students to help them plan, monitor, and evaluate specific aspects of their learning?
- How can you give them opportunities to use these strategies with support, and then independently?
- How can you ensure you set an appropriate level of challenge to develop students’ self-regulation and metacognition in relation to specific learning tasks?
- In the classroom, how can you promote and develop metacognitive talk related to your lesson objectives?
- What professional development is needed to develop your knowledge and understanding of these approaches? Have you considered professional development interventions which have been shown to have an impact in other schools?
The recent changes to the Toolkit affect the headline figures for metacognition and self-regulation, reading comprehension strategies and mentoring, as shown in Table 1. The evidence security and months of learning progress for the other approaches in the Toolkit are unchanged. Metacognition and self-regulation has decreased from eight months of learning progress to seven months, and mentoring has decreased from one month of learning progress to zero months, while reading comprehension strategies has gained an additional month of learning progress, moving from five months to six months. The addition of new studies to each of these approaches has changed the overall effect size and thus the months’ worth of learning progress.
Table 1 Changes to the headline measures in the Toolkit
| Approach | Total number of studies (number of studies added) | Changes in cost | Changes in evidence security | Changes in impact | Mean weighted effect size now (previous effect size) | Impact now in months |
| --- | --- | --- | --- | --- | --- | --- |
| Metacognition and self-regulation | 20 (addition of seven studies) | | | Decreased from eight months to seven months | | 7 |
| Reading comprehension strategies | (addition of five studies) | | | Increased from five months to six months | | 6 |
| Mentoring | (addition of three studies) | | | Decreased from one month to zero months | | 0 |
As can be seen in Table 1, these changes are relatively minor, although the change in effect size for metacognition and self-regulation is the largest. This is also the approach with the greatest increase in the number of studies added to the meta-meta-analysis: seven new studies were added to the previous 13, making a total of 20 studies used to calculate the mean weighted effect size.
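The arithmetic behind these shifts can be sketched with a small example. The helper function and all of the figures below are hypothetical, invented purely for illustration; they are not the Toolkit’s actual studies, effect sizes or weighting code.

```python
# Illustrative sketch: how adding new studies shifts a sample-size-weighted
# mean effect size. All numbers below are invented for demonstration and
# are NOT the Toolkit's underlying data.

def weighted_mean_effect_size(studies):
    """studies: list of (effect_size, n_students) pairs.
    Studies with larger samples get proportionally more weight."""
    total_n = sum(n for _, n in studies)
    return sum(es * n for es, n in studies) / total_n

# Hypothetical existing evidence base: (effect size, number of students)
existing = [(0.75, 200), (0.60, 150), (0.80, 100)]
# Hypothetical newly added studies with somewhat lower effects
new_studies = [(0.45, 300), (0.55, 250)]

before = weighted_mean_effect_size(existing)
after = weighted_mean_effect_size(existing + new_studies)
print(f"before: {before:.2f}, after: {after:.2f}")
```

Because the new (hypothetical) studies report lower effects and have large samples, they pull the weighted mean down, which is how adding studies can move a headline figure in either direction.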
Educators are using the Toolkit:
- to inform the research they study with their Professional Learning Communities
- to introduce new approaches or practices into their school, drawing on the Toolkit’s evidence base
- in conversations with each other in the staffroom
- in conversations with parents, to explain the evidence base of a certain approach.
The Toolkit uses a mean weighted effect size that is then translated into months’ worth of learning progress. The meta-analyses in the Toolkit are weighted according to the number of students in the underlying studies: the larger the sample, the higher the weighting. The months’ worth of learning progress is a translation from the effect size according to the ranges shown in Table 2.
Table 2: The range of effect sizes for months’ worth of progress
|Months impact||Effect size from …||… to||Description|
|0||-0.01||0.01||Very low or no effect|
Source: (Evidence for Learning, 2018a)
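The translation in Table 2 can be sketched as a simple band lookup. Only the zero-month band (−0.01 to 0.01) is taken from the Table 2 row shown above; every other band in this sketch is a hypothetical placeholder, not the Toolkit’s published ranges.

```python
# Sketch of translating a mean weighted effect size into months' worth of
# learning progress via banded ranges, as in Table 2.
# Only the 0-month band comes from Table 2; the rest are hypothetical.
BANDS = [
    (-0.01, 0.01, 0),  # very low or no effect (from Table 2)
    (0.02, 0.09, 1),   # hypothetical band
    (0.10, 0.18, 2),   # hypothetical band
    (0.19, 0.26, 3),   # hypothetical band
    (0.27, 0.35, 4),   # hypothetical band
    (0.36, 0.44, 5),   # hypothetical band
    (0.45, 0.52, 6),   # hypothetical band
    (0.53, 0.61, 7),   # hypothetical band
    (0.62, 0.69, 8),   # hypothetical band
]

def months_of_progress(effect_size):
    """Round to two decimals, then look up the inclusive band."""
    es = round(effect_size, 2)
    for low, high, months in BANDS:
        if low <= es <= high:
            return months
    raise ValueError(f"effect size {effect_size} outside tabulated bands")

print(months_of_progress(0.00))  # falls in the Table 2 band -> 0 months
print(months_of_progress(0.57))  # -> 7 under these hypothetical bands
```

This is why small shifts in the mean weighted effect size only sometimes change the headline figure: the effect size must cross a band boundary before the months’ worth of progress changes.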
In addition to the international evidence base within the Toolkit, each approach has a summary of evidence from Australia and New Zealand (Evidence for Learning in collaboration with Melbourne Graduate School of Education, 2017). This work was completed in partnership with the Melbourne Graduate School of Education at the University of Melbourne. Researchers can add to these summaries through collaboration with Evidence for Learning, Melbourne Graduate School of Education and the Education Endowment Foundation. The Australasian Research Summary on Teaching Assistants has been updated with new research.
Recently, new studies have been added to the Toolkit that have changed the headline figures for three approaches, including metacognition and self-regulation. Metacognition and self-regulation has changed from eight months of learning progress to seven months, while reading comprehension strategies has increased by one month of learning progress, from five to six months. The Toolkit uses a mean weighted effect size so that the sample sizes of the studies within the meta-analyses are taken into account. Educators can use the Toolkit to make evidence-informed decisions at their school. A fitting conclusion is to end with the words of an educator who is using the Toolkit:
Trish Johnstone, Teacher, Kennington Primary School
Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating how we learn to improve. Educational Researcher, 44(9), 467–477.
Education Endowment Foundation. (2018). Evidence for Learning Teaching & Learning Toolkit: Education Endowment Foundation. Retrieved from http://evidenceforlearning.org.au/the-toolkit/full-toolkit/
Evidence for Learning. (2018a). About the Toolkit. Retrieved from http://www.evidenceforlearning.org.au/the-toolkit/about/
Evidence for Learning. (2018b). Impact Evaluation Cycle. Retrieved from http://evidenceforlearning.org.au/evidence-informed-educators/impact-evaluation-cycle/
Evidence for Learning in collaboration with Melbourne Graduate School of Education. (2017). Australasian Research Summaries. Retrieved from http://www.evidenceforlearning.org.au/australasian-research-summaries/feedback
Hattie, J. (2015). What works best in education: The politics of collaborative expertise. Open Ideas at Pearson. London: Pearson.
Sharples, J. (2013). Evidence for the frontline: A report for the Alliance for Useful Evidence. Alliance for Useful Evidence.