Evidence for Learning today released a report on its latest trial of a maths program, QuickSmart Numeracy, adding to earlier reports aimed at improving the quality of evidence teachers can draw on when helping students in the classroom.
QuickSmart is designed to help students become ‘automatic’ in basic maths skills and then apply them in more advanced maths tasks. Students attend 90 QuickSmart sessions over 30 weeks in addition to their regular maths classes.
The trial found that, on average, QuickSmart had one month’s additional impact on maths achievement compared with students who participated only in regular maths classes, and there appeared to be greater gains for students attending more QuickSmart sessions. However, the results were not statistically significant, so they need to be treated with some caution.
Evidence for Learning Director Matt Deeble said:

“Improving the use of evidence is one way we can increase the learning of all students, but it has particular significance for students who are struggling.”
Evidence for Learning was established to fill a gap in Australia’s education system in the production, sharing and use of evidence by teachers and school leaders. It identifies promising educational programs and then engages independent evaluators to run randomised controlled trials (RCTs), in which half the students are randomly assigned to the program while the other half receive regular classroom instruction.
While this type of research has been common in healthcare for the last 30 years, it is increasingly used in other fields, such as education, to objectively understand the additional benefit of different practices.
Evidence for Learning publicly reports the findings of all trials, regardless of the result, to ensure transparency in the process.
SiMERR National Research Centre, the not-for-profit developer of QuickSmart based at the University of New England, has built evidence of the program’s benefits since the program commenced in 2001, and agreed to Evidence for Learning’s request to carry out this RCT.
“SiMERR deserves enormous credit for agreeing to participate in a trial of this nature and to open themselves up to a level of visibility that is very rare in education research.”
A total of 23 schools from Sydney Catholic Schools joined the trial to deliver QuickSmart throughout 2017. The Teachers and Teaching Centre at the University of Newcastle conducted the independent evaluation, analysing results for all students at the start and end of the program and six months after the final QuickSmart session.
QuickSmart recommends that students participate in 90% or more of the 90 sessions to achieve maximum benefit. However, in this trial only 12% of primary students and no secondary students achieved this level of attendance.
There was strong evidence that QuickSmart improved primary school students’ interest and confidence in maths. Students who achieved higher levels of participation appeared to make greater improvements, but the number of students in this group was too small to say conclusively; further study of this group is warranted. The direct costs of implementing QuickSmart were also found to be very low relative to other approaches.
Deeble also said:
“This type of research commonly shows smaller gains than developers have seen previously. Our UK partner, the Education Endowment Foundation, has run more than 100 of these experiments. Only one out of every five showed any additional benefit, three showed no significant difference, and one was actually worse than ‘business as usual’.”
“The reasons for the findings in this trial and the implications for future research into QuickSmart are explored in our report. We welcome further discussion of the issues identified as part of a collective endeavour to build a stronger evidence base in Australian education.”
The full findings and ‘practitioner-friendly’ resources are available.