Evaluating the difference CEED has made
By David Pannell (University of Western Australia)
The impacts of CEED’s activities were measured in terms of influence on policy; improvements to decision-making processes and management practices; levels of collaboration, engagement, and translation of the research with governments, industry, the community and other researchers; and outstanding academic performance.
Data were collected for 87 discrete projects that have operated within CEED. Engagement with end users was strong, involving 110 different end users and stakeholders: 79 from within Australia and 31 from overseas.
Results were highly heterogeneous. The majority of the observed impact has occurred in a minority of the projects, with 18% of projects being rated as high-impact. However, for almost half of the projects, the potential future increase in impact was assessed as being moderate or high. This reflects the time lags involved in research attempting to influence policy and management. The correlation between impact and academic performance was positive but low.
People close to CEED are well aware that CEED researchers have made many important contributions to environmental policy and management. However, measuring these impacts is notoriously difficult.
Even if environmental policy or management has changed, it can be very difficult to know the extent to which the change can be attributed to research. Policy development is a complex and messy process, with many players involved. And the eventual impacts of policy change on environmental outcomes are often unclear and delayed. Maybe that is why quantitative analyses of the impact of research on environmental management, environmental policy and environmental outcomes are rare.
Nevertheless, a small team from UWA took on the task of trying to capture CEED’s impacts, as well as documenting its academic outputs, collaborations and citations. We hope our approach might assist other environmental research networks and centres to measure the influence of their own research efforts.
The CEED impact evaluation collected data on 87 CEED projects and discussed nine of these in detail (see the box on report structure). It found high academic performance across many of CEED’s outputs, and high policy and management impact in some projects, though not all.
Lessons and implications
A number of important lessons and implications were identified in the impact analysis.
There have been many studies on the factors that underpin research impact, and most of them highlight the importance of engagement and good relationships with research users, and the quality of communication (see Decision Point #73, Decision Point #74 and Decision Point #105).
However, the evaluation found that just as important was what research is actually done. If research is not providing insight or tools that are actually useful to policy makers or managers, even strong relationships and excellent communications won’t lead to impact.
Therefore, developing a research culture that values impact and considers how it may be achieved prior to the selection of research projects is potentially important. The role of the centre leadership team in this is critical. Embedding impact into the culture of a centre probably happens more effectively if expertise in research evaluation is available internally, either through training or appointments.
A challenge in conducting this analysis was obtaining information related to engagement and impact. There may be merits in institutionalising the collection of impact-related data from early in the life of a new research centre.
In this analysis, there was little correlation between academic performance and impact on policy and management. It should not be presumed that the most impactful projects will be those of greatest academic performance.
Finally, there are often long time lags between commencing research and delivering impact – decades in many cases. Therefore, there is a need to allow the longest possible time lag when assessing impact. On shorter timescales, it may be possible to detect engagement, but not the full impact that will eventually result.
The evaluation of CEED’s impact involved asking CEED researchers what they were working on and what they thought the impact of this work was, and then asking users of that research what they believed the impact was and what evidence supported that belief.
The report itself was divided into six sections.
First, there is a general discussion on the challenges of measuring the benefits of environmental research, with an outline of the conceptual framework that informed our approach.
Second, information about the impacts of 87 projects conducted within CEED is set out.
Third, more detailed information about nine case studies is discussed (these case studies being selected from the larger set of 87 projects). The nine case studies vary widely in the types of environmental issues addressed, type of research, and scale of impact. Evidence presented includes statements by end users of the research.
Fourth, there is an assessment of publications, citations and collaborations within the Centre.
Fifth, there is a discussion on the role played by Decision Point (the Centre’s main outreach publication) in building a culture of engagement by environmental researchers, and a culture of using evidence and analysis among environmental managers and policymakers.
The report concludes with a discussion on the lessons and implications arising from the analysis.
More info: David Pannell firstname.lastname@example.org
Reference: Thamo T, T Harold, M Polyakov & D Pannell (2018). Assessment of Engagement and Impact for the ARC Centre of Excellence for Environmental Decisions. University of Western Australia. Note: the final report will be available on the CEED website shortly.