Understanding effective evaluation of the impact of outreach interventions on access to higher education: phase two

21 August 2017 - 31 December 2018

PI/s in Exeter: Dr Anna Mountford-Zimdars

CI/s in Exeter: Pallavi Banerjee, Joanne Moore, Professor Debra Myhill

Research partners: Tim Quine (Geography, University of Exeter)

Funding awarded: £69,910

Sponsor(s): OFFA (Office for Fair Access)

About the research

This project sets out to determine the efficacy and usability of the Standards of Evaluation Practice: a set of usable and transferable standards which provide guidance on different approaches to evaluation, and evidence about which types of outreach intervention have the greatest impact in different geographic and institutional settings. The project design is mindful that the evaluation tools will be used largely by practitioners rather than evaluation experts, and will therefore address whether the tools and guidance sufficiently mediate this expertise on evaluation and data to outreach professionals. Accordingly, we approach this endeavour aware from the outset of the need to promote evaluation where we ‘collect evidence in a structured way to provide a common language for communicating impacts, where the evidence requirements are realistic and proportionate, to generate useful evidence to ensure we are making a positive difference’.

In the current context, practitioners are being challenged to understand more about effective widening participation evaluation practices, and to evaluate more effectively and more consistently across the breadth of institutions within the sector. Frameworks for the evaluation of outreach and widening participation have been developed over a number of years. The Phase One work was developed in partnership with universities, and seeks to address tensions between statistical behavioural studies and qualitative data through a focus on comparative studies. Achieving agreement on evaluation solutions and approaches that are owned by the sector and can operate within a changing landscape is challenging, and relies on access to appropriate data sources and on staff with analytical expertise and knowledge. The proposed project will be important in developing practice-based conclusions and guidance on different approaches to evaluation, and usable and transferable standards of evaluation practice which have been tested ‘in the field’ (as well as evidence of the impact of different types of interventions).

Improvements in administrative data sources have afforded greater opportunities for robust evaluations, but there are ethical implications, and control groups may be unreliable. Improved availability of matched administrative data is allowing for more comparative research, and it will be important to test the data sources and their uses, and the extent to which different approaches provide alternatives to randomised control trials in some circumstances. Collaborative working has been identified as a productive way of addressing these issues (e.g. HEAT membership). This is a promising development, although it will be important to test whether pooled data allows the specificity of each activity to be addressed.

Project aims:

  • To further understanding of evaluation practices in outreach activities;
  • To support institutions in developing evaluation practices for their outreach activities;
  • To influence guidance to institutions on data collection and evaluation practice;
  • To develop guidance and practices which can withstand changes to the higher education sector;
  • To provide evidence about the types of outreach intervention that have the most impact in different geographic and institutional contexts.

Key questions the project will address:

  • How effective are the Standards of Evaluation Practice in determining the impact of outreach activities?
  • How useful are the Standards of Evaluation Practice in determining the impact of outreach activities?
  • How well do the Standards of Evaluation Practice and accompanying guidance mediate understanding of evaluation methodologies to outreach professionals?