Four issues schools face when using evidence for improvement

Guidance on which approaches to choose and how to implement them.

Authors: Dr Tanya Vaughan and Katie Roberts-Hull

Blog • 6 minutes

School leaders and teachers are constantly looking for ways to improve professional practice. For decades, these efforts mostly involved teachers attending external workshops on the latest trendy topics, usually with limited impact on student achievement. But in many systems, including Australia’s, educators are moving to bring professional learning in-house and to ensure that new practices are based on evidence of their own students’ learning.

This change has the potential to greatly improve school practice. Many schools are now veterans at reshaping teaching based on evidence and have seen results in improved student outcomes. Yet for many other schools this is a new focus, and they may need guidance on which approaches to choose and how to implement them.

Learning First and Evidence for Learning have reviewed hundreds of research papers, conducted research trials and seen practice at dozens of schools. Combining our experience with international and local academic evidence, we’ve identified four common issues that schools face when trying to use evidence-based improvement practices:

Issue 1: Solutionitis

‘Solutionitis’ happens when schools are so focused on ‘using evidence’ that they jump to a potential solution without first analysing the students’ learning problem.

Analysing international research (Jensen, Sonnemann, Roberts-Hull, & Hunter, 2016; Vaughan & Albers, 2017), we’ve found that an improvement cycle (or impact evaluation cycle) is critical for teacher learning and school improvement. A school improvement cycle starts by analysing a student learning problem: gathering many data points to deeply understand the issue and then prioritising a specific problem of practice. The goal is to understand the problem first, because that helps identify which possible solutions may be most useful in the next stage of the cycle.

For example, a school interested in evidence-based practice may learn that ‘feedback’ is a high-impact approach, and decide to roll out professional learning on how to give better feedback. But this is probably the wrong approach. While better feedback may be helpful, the school has not first identified a clear problem it is trying to solve. Not only might there be a better way to address the student learning issue, but the school has also not identified a specific goal that implementing feedback would serve. This means it will be impossible to measure whether the introduction of feedback has made a difference to student learning.

To check whether your school is suffering from solutionitis, ask these questions:

1. What student learning issue are we trying to address by implementing these new practices?

2. Have we spent time analysing multiple types of data on this issue before jumping to the solution?

3. Do we have a clear student learning goal in mind that we can track and review later to make sure this solution has had an impact on it?

Issue 2: Access to the right information

After a problem has been analysed, teachers need to determine the next step: how to solve the problem. This is where evidence comes in. Schools need access to clear information about which practices work best. Plain-English summaries of evidence, such as those provided in the Teaching & Learning Toolkit by Evidence for Learning (Education Endowment Foundation, 2017), are most useful. Easy-to-digest evidence summaries are critical at this stage of the improvement cycle. For example, a team of Year 1 teachers may have prioritised improving reading skills for a subset of students who are falling behind. As they look for solutions, they need access to reliable information on which instructional approaches for young readers have the strongest evidence, and which approaches are less effective.

The improvement cycle falls short if teachers do not have access to the right information in the right format about how to improve their teaching practice to solve their priority student learning issue.

Not every student learning approach has a perfect evidence base, but teachers need to know what evidence is out there and how strong it is, and to see examples of how to use it.

Issue 3: Reviewing and evaluating impact

A critical part of the improvement cycle is to review and evaluate whether the changes made have improved student learning. Even if a practice has perfect evidence behind it, implementation can fall short, and new practices don’t always work perfectly on the first try. Existing evidence is also often based on research from contexts different to your own school or classroom (for example, research on students of a different age to the students you are teaching). It is therefore vital to check that the practices are improving student learning in your context. Often a tweak to implementation, or a reconsideration of the new practice, may be needed after the initial evaluation and review.

Issue 4: Implementation

New approaches must be given the greatest chance to succeed. Learning First and Evidence for Learning are interested in understanding the key parts of implementation that schools need to focus on to ensure a successful change becomes embedded. To help identify the essential ingredients for improving student outcomes, Evidence for Learning commissioned a scoping review of studies of implementation in education.

The summary report identifies four crucial components of implementation:

  1. Fidelity: Did we stay true to the plan, or was the approach altered during implementation?
  2. Dosage: How many hours of teaching, coaching or other support are needed for the greatest chance of success?
  3. Quality of implementation: What is the quality of support provided by teachers and school leaders?
  4. Acceptability: Did stakeholders, including students, accept the relevance and importance of this approach?

One example of implementation can be found in the research evaluation of The Song Room program (conducted by Dr Tanya Vaughan), which is highlighted in Evidence for Learning’s Australasian Research Summaries (see references below). The Song Room uses a range of programs to ensure that children across Australia have access to music and creative arts education. The evaluation illustrates all four elements of implementation. The amount of time students took part in The Song Room program was found to influence the level of impact on their learning outcomes, demonstrating the importance of dosage: the longer the program participation, the stronger the impact on attendance, grades and NAPLAN results.

The qualitative analysis found that the support of the school leader (quality of implementation) was important to ensuring successful student outcomes. Acceptability was investigated through interviews with students. These produced some of the most insightful and powerful parts of the evaluation. For example, one student described how The Song Room program had reduced bullying at the school. The student pointed to another student within the group and said, ‘You used to bully me before we were both in the same drumming group’. The other responded, ‘Yes I did, and now I don’t anymore’. A critical part of the program’s success at this school was making students the leaders of their groups. Given a leadership opportunity, the students stepped up to the challenge and the result was a reduction in bullying.

The growing focus on evidence-based school improvement is incredibly promising. Many schools are only at the beginning stages of implementing new approaches, and could benefit from considering evidence through the lens of an improvement cycle. If schools get the improvement process right by using the cycle, they will be more likely to have success with implementing evidence-based approaches.

References

Caldwell, B. J., & Vaughan, T. (2012). Transforming Education through The Arts. London and New York: Routledge.

Education Endowment Foundation. (2017). Evidence for Learning Teaching & Learning Toolkit. Retrieved from http://evidenceforlearning.org.au/the-toolkit/

Jensen, B., Sonnemann, J., Roberts-Hull, K., & Hunter, A. (2016). Beyond PD: Teacher Professional Learning in High-Performing Systems, Australian Edition. Retrieved from https://learningfirst.com/wp‑c…

Kivel, L. (2015). The Problem with Solutions. Carnegie Commons Blog, Carnegie Foundation for the Advancement of Teaching. Retrieved from https://www.carnegiefoundation.org/blog/the-problem-with-solutions/

Vaughan, T., & Albers, B. (2017, 20 July). Research to practice – implementation in education. ACER Teacher Magazine.

Vaughan, T., & Caldwell, B. J. (2014). Improving literacy through the Arts. In G. Barton (Ed.), Literacy and the arts: exploring theory and practice. Dordrecht: Springer.

Vaughan, T., Harris, J., & Caldwell, B. J. (2011). Bridging the Gap in School Achievement through the Arts: Summary report. Retrieved from http://www.songroom.org.au/wp-content/uploads/2013/06/Bridging-the-Gap-in-School-Achievement-through-the-Arts.pdf