Best Practice

CPD: How to unlock education research

Unpacking research findings can be challenging. Dr Fiona Aubrey-Smith and Professor Peter Twining offer teachers and CPD leaders some practical tips on how to source and interpret the latest research efficiently and effectively, and discuss some of the common flaws to look out for

As a profession, we are getting better at using research evidence to inform our decision-making, both individually and collectively. However, just over a year ago, the National Foundation for Educational Research and the Education Endowment Foundation (EEF, 2019) undertook research which suggested that there is still a long way to go (see also Nelson & Walker, 2019).

Part of a school becoming an effective professional learning environment is engaging meaningfully with available research and embedding specific types of strategic thinking and evaluative focus into practice (Twining & Henry, 2014). While many teachers and leaders are now more proactively engaging with academic research through formal studies (such as Master’s programmes), others will not yet have had the opportunity to be coached in how to source and unpack robust research specific to their own practice. Whether you are a classroom practitioner or a leader supporting others, here are some tips on how to source and interpret the latest research findings.

Left or right?

Some research says go right. Some research says go left. So how do you know which way to go?

For every research article that argues for a particular approach, there will be another article somewhere which argues that a different approach is better. Your task is to ask:

  • Who conducted the research? What biases or motivations might they have had?
  • What evidence are their arguments based upon?
    • Have they explored findings from previous research? Do they consider and weigh up contrasting studies or might they have only included research that supports their argument? Have they excluded all the studies that don’t include statistical data? (Many systematic reviews do this.)
    • Have they generated their own data? If so, how appropriate was the method used to generate that data? (What people say they do is often different to what they actually do.)
    • Do they clearly explain how they analysed the data? Have they probed below the surface to explore different interpretations? Have they considered all of their data or cherry-picked bits that fit the story they are telling?
    • Would you have come to the same conclusions based on the data they presented? Are their conclusions based on the data that they actually collected? (You would be surprised how often research conclusions go beyond what the data analysis shows.)
  • How does the evidence relate to what is already known? Do they discuss how their findings relate to findings in the existing literature? Do their findings align with or contradict what is already known, and if they contradict previous research, do they explain and justify why their conclusions are more credible?
  • How relevant is the research to your situation? To what extent is the context of the research similar or different to your own context? (Just because something works in one setting doesn’t mean it will work in yours.) Are there insights that might be relevant or challenge your thinking even if their context is different to yours?


Demand more than graphs and percentages

The detail matters most. Headlines tend to follow research based on numerical data, which means such research often becomes better known and more widely publicised. Numerical findings are easy to communicate – through graphs with lines that go up or down, and percentages that sound good or bad. They are quick and easy to skim-read, and they are often compelling – statistics seem unquestionable.

However, statistical techniques are often misused, meaning their findings are worthless. Equally importantly, numerical data often over-simplifies very complex contexts and, while appearing objective, can often be biased, unreliable or invalid (e.g. surveys based on self-reporting create responses which are highly subjective – “do you agree?”, or “rate on a scale of 1-5”).

A good test of these forms of evidence is to ask yourself: how would I answer that same question if I were asked? The chances are that you would give a more detailed explanation or would want to qualify your response, because you instinctively know that the detail matters or that the issue is more complex than the question seems to assume. So seek out studies that use that same level of detail in their data generation and analysis – studies that attempt to understand what is happening and why, through detailed analysis of what people say and do (often based on interviews, observations and case studies).

You can use a mix of findings based on quantitative (numerical) and qualitative (non-numerical) data to inform your thinking – but don’t assume that the numerical data is more credible or objective than the non-numerical data.


Correlation can just be a coincidence

Where research makes claims about an approach, strategy, intervention, product or method correlating with improved outcomes, bear in mind that correlation is not the same as causation. Ask: what else is going on in the study?

A great example is when someone claims that those who used a particular strategy saw an increase in assessment outcomes or motivation. These increases may or may not have anything to do with the strategy itself – they might instead simply be a result of those involved being more aware of their own actions and therefore more precise and diligent. Or the schools that used those strategies may have been doing so as part of a broader programme of improvement interventions.


A critical eye...

Published research can provide useful evidence to inform your practice, so seek it out and do not be put off if at first it seems impenetrable. As with anything, it will become less daunting as you become more familiar with the jargon researchers use.

Remember that as a professional educator you have a deep understanding of teaching and learning – use that to help you read the research with a critical eye. Trust your judgement – be prepared to learn from the research, but also question it and discuss it with colleagues. Then draw your own conclusions about how credible it is and how it should inform your practice.

  • Peter Twining is professor of education (innovation in schooling and educational technology) at the University of Newcastle, Australia, having formerly been professor of education at the Open University in the UK. He has also been a primary school teacher, initial teacher educator, head of the department of education at the Open University, and the co-director of the Centre for Research in Education and Educational Technology. Follow him @PeterT
  • Dr Fiona Aubrey-Smith supports schools and trusts with professional learning, education research and strategic planning. She is the founder of One Life Learning, an associate lecturer at the Open University, a founding fellow of the Chartered College of Teaching, and sits on the board of a number of multi-academy and charitable trusts. Read her previous articles for Headteacher Update via http://bit.ly/htu-aubrey-smith and follow her @FionaAS


Further information & resources

  • EEF: Putting evidence to work: A school’s guide to implementation, December 2019: http://bit.ly/2EzO0Cz
  • Headteacher Update: Every teacher a researcher, Fiona Aubrey-Smith, March 2018: https://bit.ly/3xMGlrb
  • Headteacher Update: Evidence to Action: The 7 Es of professional development, Fiona Aubrey-Smith, March 2021: https://bit.ly/3dsivK7
  • Headteacher Update: CPD that makes a difference: a checklist for schools, Fiona Aubrey-Smith, July 2021: https://bit.ly/3g5EoQJ
  • Nelson & Walker: Evidence-informed approaches to teaching – where are we now? NFER, May 2019: https://bit.ly/3m75Ile
  • Twining & Henry: Enhancing teaching in English schools: Vital lessons, World Journal of Education (4,2), 2014: https://bit.ly/3xQHbmW

Teacher-friendly research resources