Show me the evidence - how to shine a light on university access

The response from the former head of admissions at a prestigious university said it all: ‘I would rather support more students than waste money on an evaluation.’ The strong rebuttal came after I suggested that the university evaluate its efforts to attract students from disadvantaged backgrounds.

His view was that we should not divert money away from helping young people. In my mind, the reaction revealed that this eminent seat of learning was stuck in the past when it came to its widening participation work.

The problem, you see, is this: we didn’t (and still don’t) know if the university’s programme was actually having any impact on the students taking part. In fact, for all we knew the students were (and continue to be) worse off for their experience.

What’s the impact?

Whether it is university visits, academic mentoring or financial help, for too long our evaluation of university access work has amounted to simply surveying students to see if they enjoyed their experience. Seldom have we investigated whether a programme has actually had a causal impact. We need to answer that niggling question in the back of every good practitioner’s mind: would the students have progressed, attained or developed just as well if the university hadn’t intervened in the way it did?

The lack of robust evidence for what has worked remains the Achilles heel of the widening participation sector. It hastened the end of the national Aimhigher programme. It is scandalous that most work in our universities and colleges aimed at opening doors to students continues to go unevaluated. We spend hundreds of millions of pounds a year on this work, yet we are effectively operating in the dark.

How to build the evidence

So you can imagine my joy on reading HEFCE’s list of recommendations aimed at promoting a more evidence-led approach. It suggests that institutions consider how they might carry out stronger research evaluations of widening participation interventions, such as randomised trials and studies with comparison groups. It proposes the development of an accessible guide to enable practitioners to see which approaches are supported by the best evidence.

We have been developing plans to work with universities to commission these long overdue evaluations. Next month the Sutton Trust will publish the results of a worldwide review of robust research on university access. With the Office for Fair Access we will appoint a researcher to develop the groundwork for future evaluations, agreeing the best methods and metrics to use.

There are many insights from the Trust’s work in developing evidence for teachers. In 2010, the Trust published its pupil premium toolkit for schools, summarising research on which approaches worked best for raising the attainment of disadvantaged pupils. The toolkit now underpins the work of the Education Endowment Foundation, and is widely read. The Foundation has subsequently commissioned over 100 evaluations in schools across England.

Our work with teachers points to several lessons for any evidence-led movement for widening participation. You need to build the evidence carefully, piloting approaches before heading into full-scale trials. A mix of quantitative and qualitative research is required. You have to create genuine partnerships between expert practitioners and the researchers undertaking evaluations. Creating evidence of what has worked, meanwhile, is only the first small step in a long journey towards a professional culture that genuinely acts on evidence.

If we are to embark on this journey, then universities will need to spend some of their access budgets on evaluation. Robust studies are not easy to do. They cost time and money, and can produce unexpected results. But not to seek evidence is akin to burying our heads in the sand, failing the very students we are trying to help.