Whatever it is called across the globe – soccer, football or the round ball game – the sport is affectionately known as the ‘world game’ because it is played in so many different countries. With the same basic rules and equipment (a field, a ball and 11 players) it is accessible to almost everyone. Fans can celebrate great players and great feats no matter where they come from – and so greatness in one country can spur innovation in another.
Without torturing the analogy or overclaiming on the impact of Evidence for Learning’s (E4L) work, there are parallels with the world of evidence in education. There is a shared set of rules and protocols (around types of research and evidence); each country has its own traits and style of play (the context created by our geography, populations, systems and politics); and we all get better when we observe and test ourselves against others (continuous improvement). E4L is seeking to be the best we can for our learners (to ‘improve the game’).
This blog is about an international ‘tournament’ held in London in late October 2017 and the relationships and opportunities that are flowing from it.
The Education Endowment Foundation (EEF) hosted and chaired its first Global Partners Conference in London on 28–29 October 2017, with representatives from 10 countries: Australia, Chile, England, Ireland, Japan, Jordan, New Zealand, Scotland, Spain and Wales.
The EEF is an English charity that was set up seven years ago with a grant from the English government. It aims to reduce the attainment gap for learners from disadvantaged backgrounds through the creation and promotion of better evidence on effective school programs. The EEF is a founding partner of Evidence for Learning and is the creator and custodian of the research content on which the Australian Teaching & Learning Toolkit is based.
The conference brought together organisations involved in the mobilisation of education evidence in their respective countries. Some are groups within government (either departments of education or agencies of government), some are purpose-built independent non-profits and some are philanthropic projects.
All share a common purpose: to improve educational outcomes for learners by moving research and evidence off the shelf and into the hands of practitioners.
Over the course of the two days, some fascinating topics were covered that will inform and animate Evidence for Learning’s work in 2018. A brief synopsis of each is below:
Same evidence, different Toolkits
Different countries are customising the global evidence in the Teaching & Learning Toolkit (the Toolkit) to make it relevant for the policy and practice context in their country. There were great presentations and insights from colleagues in Latin America (a Spanish-language Toolkit experience), Scotland (the Scottish take on the content), as well as Australia’s own version.
Policies for evidence
Attendees enjoyed presentations and discussions from other participants on how their respective governments and philanthropists can create conditions that support (or hinder) practitioners building, sharing and using evidence to improve outcomes. Perspectives came from places as diverse as Brazil, Chile, Japan, Jordan, New Zealand and Wales. It was especially interesting to learn how different stages of development and infrastructure shape the topics of interest and the ways of improving education.
For example, colleagues in Latin America are more interested in the evidence on whole-system policy questions (such as teacher pay levels or school choice), whereas in the UK the focus is on effective practices in schools.
Participants helped each other consider effective ways of promoting our common evidence agenda with the stakeholders and influencers in the education sector in their country.
Attendees discussed a change in the way research studies will make their way into the Toolkit. Currently, a research study needs to be included in a systematic review or meta-analysis (a study of studies) on the relevant topic in order to be included in the Toolkit calculations. Over the next few years this will change: individual studies will be coded or tagged against more than one relevant topic and can then be searched for dynamically by a user. So rather than a study looking at the academic performance of a well-being program targeting 10–12-year-old boys from low socioeconomic backgrounds appearing only in the Toolkit approach on ‘behaviour interventions’, the study will appear in any search that includes ‘behaviour’, ‘social and emotional learning’, ‘boys’ education’, ‘primary education’ or ‘low SES’. This moves the evidence beyond basic ‘what works’ knowledge to the more valuable ‘what works, for whom and under what conditions’ – the questions that practitioners are most interested in.
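For readers curious about the mechanics, the tagging-and-search idea described above can be sketched in a few lines of code. This is purely illustrative – the study records, tag names and function are invented for this example and are not the Toolkit’s actual data or schema:

```python
# Illustrative sketch only: studies tagged on multiple topics,
# filtered dynamically by whatever tags a user searches for.
# All records and tag names here are hypothetical.

studies = [
    {"title": "Well-being program, boys aged 10-12",
     "tags": {"behaviour", "social and emotional learning",
              "boys' education", "primary education", "low SES"}},
    {"title": "Phonics intervention, Year 1",
     "tags": {"literacy", "primary education"}},
]

def search(studies, *query_tags):
    """Return the titles of studies carrying every requested tag."""
    wanted = set(query_tags)
    return [s["title"] for s in studies if wanted <= s["tags"]]

# The same study now surfaces under any combination of its tags,
# not just a single topic page.
print(search(studies, "primary education"))     # matches both studies
print(search(studies, "behaviour", "low SES"))  # matches the well-being study
```

Because each study carries several tags, one piece of research can answer many different practitioner questions rather than sitting under a single heading.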
Cross border trials
Currently EEF partners share a common approach to commissioning research, trial design methods, types of data collected, and publishing standards. Global partners do this on trials of programs that are conducted exclusively within one country.
Participants explored the potential to conduct a trial of the same type, on the same program or practice in multiple countries.
Ultimately, participants concluded that the challenges of such a trial are presently too great for it to be seriously considered. But new areas for deeper collaboration and knowledge sharing between global partners were identified, which will improve our individual, separate research efforts. And a shared promise was made to revisit the topic at the next global partners meeting.
I’m grateful for the opportunity to meet and learn from our colleagues around the world. Thank you to the great team at EEF for bringing us together and supporting our deeper collaboration. Every new country that joins the network, creates a new version of the Toolkit, commissions a new research trial or finds a new way of translating and promoting research to its educators adds a piece to the jigsaw. It increases the potential of our work to improve the learning of young people in our own country.
To return to the football analogy, the quality of our own home ground game is improved by seeing how others train and play, and testing those approaches. As for E4L, we’ve got the boots on and laced up, and we are ready to try out some new skills in the 2018 season!
Matthew Deeble is the Director of Evidence for Learning. He is responsible for Evidence for Learning's strategy and system engagement.