Blog post
Twenty years of methods use in education research: Are we moving in the right direction?
BERA launched its ‘State of the Discipline’ initiative in 2019 with the aim of offering ‘a clear, comprehensive account of the state of education as an academic discipline’. As part of this initiative, BERA commissioned five research projects whose findings were intended to provide research evidence to ‘empower [stakeholders] in advocacy for education research and education researchers’.
One of those projects was a large-scale survey of education researchers’ work, experiences and identities carried out by the University of Warwick (Morris et al., 2023). In addition to inviting respondents to share their experiences of being education researchers in UK higher education (HE), the survey also explored their research motivations, activities and expertise.
The data from this survey also provided the opportunity to explore how methods use among HE-based education researchers has changed over time. To do this we compared the findings with a similar study that was carried out more than 20 years ago as part of the ESRC-funded Teaching and Learning Research Programme (TLRP).
Some readers may remember the TLRP. It ran between 2000 and 2011 and was, at the time, the ESRC’s largest ever education research programme. Its purpose was to support high-quality research for policy and practice, with a focus on developing expertise to ‘enhance capacity’ for research on teaching and learning. This formed a strand of work undertaken by the Research Capacity Building Network (RCBN), a key output of which was a survey of hundreds of education researchers to identify current expertise in research and future training needs (Gorard et al., 2004).
Both the BERA and RCBN surveys allow us to consider how the research methods used by education researchers have varied over a 20-year period and so contribute to debates about purpose, quality and methodology within the discipline.
‘Although present-day education research is characterised by an eclectic and diverse range of approaches for some, current researchers report using fewer methods than those surveyed 20 years ago.’
Our findings show that although present-day education research is characterised by an eclectic and diverse range of approaches for some, current researchers report using fewer methods than those surveyed 20 years ago. The findings also suggest an increased polarisation in the types of methods used.

In 2002 there were four ‘types’ of researcher (see table 1). The most common type used largely numeric data, often in addition to non-numeric methods, but there were also those using mainly non-numeric data, mixed methods researchers using both almost equally, and those working largely with just a single method, usually interviews. By 2022, only two types of researcher were identified: a larger group using mainly or exclusively non-numeric data, and a smaller group of researchers who were predominantly numeric but, as in 2002, tended also to use a wider range of methods. The mixed and other single-method researchers that were apparent in the RCBN analysis were absent from the BERA data, suggesting increased polarisation between those who report mainly using numbers in their research and a larger group of researchers who mostly or entirely avoid them. We summarise these patterns in table 1.
Table 1: Percentage of respondents clustered in each researcher ‘type’
| RCBN 2002 | | BERA 2022 | |
|---|---|---|---|
| **Methods cluster** | **%** | **Methods cluster** | **%** |
| Numeric focused | 38 | Non-numeric | 65 |
| Mixed methods | 19 | Numeric focused | 35 |
| Non-numeric | 20 | | |
| Mono-method | 24 | | |
This apparent narrowing and polarisation of methods over the past 20 years is concerning, especially considering the efforts of funding councils, such as through the Q-Step programme, to increase expertise in so-called ‘quantitative’ skills. It may be that rather than building skills capacity among those new to ‘quantitative’ research, such programmes have instead, at least in education, served to enhance and develop the statistical skills of the already committed. By focusing on advanced training for a minority, these initiatives have arguably been less successful in equipping the wider research community to interpret, apply and integrate quantitative approaches in their practice. This is not to suggest that there is a dearth of ‘quantitative’ skills within education research more generally. On the contrary, where these skills exist, they appear to be relatively well embedded and integrated in projects alongside work of other kinds. Nor does this downplay the need for support in all areas of research methods: most respondents to the BERA survey wanted to develop their methodological expertise, and only a minority reported that their own methods training was high quality.
To what extent are these findings an issue for education research? Robust high-quality evidence, regardless of method, can be used to help understand education better, and so help create better and fairer systems for all (Perry & Morris, 2023). However, the findings from our research suggest that in education, we may be making only partial use of our collective methodological toolkit with implications for the kinds of questions that can be addressed, the scope and quality of research, and the type of impact that it has.
This blog post is based on an article published in the British Educational Research Journal (BERJ).
References
Gorard, S., Rushforth, K., & Taylor, C. (2004). Is there a shortage of quantitative work in education research? Oxford Review of Education, 30(3), 371–395.
Morris, R., Perry, T., Smith, E., & Pilgrim-Brown, J. (2023). Education: The state of the discipline. A survey of education researchers’ work, experiences and identities. British Educational Research Association. /publication/education-the-state-of-the-discipline-survey-of-education-researchers
Perry, T., & Morris, R. (2023). A critical guide to evidence-informed education. Open University Press.