613 252 8642 [email protected]

Using Evidence

Making Evaluation Useful and Useable.

Using Evidence is a research and evaluation consultancy that focuses on development and research evaluation. Using Evidence is concerned not only with high-quality research, but also with exploring ways to enhance the use of results.

What we do

Evaluation Methods and Research

I have had a strong focus on evaluation methods for many years because I think we need new approaches that take account of complexity and that look at systems, not only at projects and programs. I worked with colleagues on the development of Outcome Mapping as well as an approach to Organizational Assessment. Both of these methods integrate context directly into the assessment approach and treat it as a key variable. I have also worked with a team in an action research project on the development of an approach to Evaluating Capacity Building. I worked in collaboration with ORS Impact to develop an approach to evaluating advocacy capacity. We successfully piloted this 360-degree approach in a number of Bill & Melinda Gates Foundation projects. In addition, I worked with the Foundation’s Maternal, Neonatal and Child Health team on an approach to evaluating implementation research.

Quality Assurance

Major evaluations can benefit from an external perspective to review the design and draft report. I have carried out quality assessments for Universalia Management Group on evaluations they have conducted for the World Food Program, and for IUCN on Value for Money assessments. I provided quality assurance on the design of the evaluation framework for the learning component of the Global Education Fund. I am currently reviewing an evaluation at the Green Climate Fund. Like many of my evaluation colleagues, I also carry out pro bono peer reviews for a number of evaluation journals.

Enhancing the Use of Evidence

Evidence from research and evaluation needs to be used if it is going to contribute to social improvement. Too often we focus on the methods and the study itself and forget about use until the very end. Use needs to be thought about from the beginning. This focus was reinforced by a study I did for the International Development Research Centre in which we looked at how the research the Centre supports influences public policy – Knowledge to Policy. The findings of that study on the mechanisms of influence have shaped my work over the past decade. I was involved for some years with the Knowledge Sector Initiative in Indonesia, where we worked across the policy system to strengthen both the supply of and demand for policy evidence and its use, as well as working on the legislative and regulatory barriers that inhibit the effective conduct and use of evidence in policy processes. I worked with the Agricultural Science and Technology Indicators program of the International Food Policy Research Institute on a project to enhance the use of evidence gathered through this program, particularly at the national level. Currently I am working with colleagues on research on ‘pathways to the uptake of research’ for Elrha, a humanitarian research agency in the UK. I am also working with colleagues at Witwatersrand University and UNESCO on a proposal for a project that brings futures thinking to evaluation.

Evaluation Systems and Frameworks

Many organizations struggle to develop effective and use-oriented evaluation systems that help the organization learn and improve. While at the International Development Research Centre (IDRC) I was involved in the design of evaluation systems and processes that helped the organization and its programs learn from experience. I designed and implemented the first several rounds of program review at the Centre, worked with a cross-Centre team to revise our project completion reporting process, and worked with programs on building evaluation of their strategies. Since leaving IDRC, I have worked with the MasterCard Foundation providing support on their evaluation system, and I worked with a team to support the development of an evaluation and learning system at the Rideau Hall Foundation in Canada. At the systems level, I am exploring how evaluation can look at more than an organization or a program and take account of whole-system change. I have written on this in a working paper published by the Knowledge Sector Initiative and intend to develop further work in this area. I am currently working with the International Confederation of Midwives on building an organization-wide evaluation and learning system.

Evaluation

I remain actively involved in evaluations as well. While at IDRC I participated in a wide range of evaluations. More recently I have contributed to an organizational assessment of UN Women (regional architecture) in collaboration with EnCompass, and to an assessment at the Rideau Hall Foundation with Cindy Weeks Consulting. With Global Affairs Canada, I integrated a Principles-Focused approach into a country program evaluation, which GAC has since reused in other country program evaluations. I am supporting Sambodhi Research and Communications with two evaluations of maternal and child health programs in India, and I support Itad in the evaluation of the Global Challenges Research Fund.

Evaluation Capacity

Evaluation should leave behind improved capacity to do evaluation. For many years, I conducted training around the world in organizational assessment and Outcome Mapping. More recently I have been working with the UNESCO Regional Office for Sciences in Latin America and the Caribbean on a Massive Open Online Course, “Inequities in Latin America and the Caribbean: Research and Policy for Social Transformations”, delivering three modules on evaluation. The course was piloted in 2017 and is running in both Spanish and English in 2018. I published a chapter in the June 2017 issue of New Directions for Evaluation, ‘Building Evaluation Capacity to Address Problems of Equity.’ I am working with colleagues on thinking through how to address issues of culture in evaluation and how to promote a more active integration of culture into evaluation thinking and methods. I continue to deliver Outcome Mapping workshops periodically, among others for the Higher Education department of the Ministry of Research, Technology and Higher Education in Indonesia, and for the Atlantic team of the First Nations and Inuit Health Branch (FNIHB) of Indigenous Services Canada. Over the past three years (2018-2020), I have provided evaluation capacity support to projects of the Ontario Brain Institute.

Featured Publications

Fred Carden is the author or co-author of 8 books published in a dozen languages, as well as numerous chapters, articles and presentations.

Evaluating research co-production: protocol for the Research Quality Plus for Co-Production (RQ+ 4 Co-Pro) framework

Research co-production is an umbrella term used to describe research users and researchers working together to generate knowledge. Research co-production is used to create knowledge that is relevant to current challenges and to increase uptake of that knowledge into practice, programs, products, and/or policy. Yet, rigorous theories and methods to assess the quality of co-production are limited. Here we describe a framework for assessing the quality of research co-production, Research Quality Plus for Co-Production (RQ+ 4 Co-Pro), and outline our field test of this approach. Download Complete Article – 1.9MB PDF

Evaluating the quality of research co-production: Research Quality Plus for Co-Production (RQ + 4 Co-Pro)

This article in Health Research Policy and Systems reports on the application of the Research Quality Plus methodology, used for assessing the quality of development research, to co-production research. Projects of the Integrated Knowledge Translation Research Network participated in this research. Download PDF, 2MB

Informing Advocacy and Communications Capacity Building Efforts

This assessment framework, aimed at strengthening advocacy and communications capacity, was developed and validated through a pilot program with 19 grantees of the Bill & Melinda Gates Foundation, spread across 11 countries in Africa and Asia. The brief describes the methodology, presents the tools, and includes insights from initial implementation that might be useful to others. Published by ORS Impact in Seattle. Download PDF, 2.2MB

Knowledge to Policy

Making the Most of Development Research

The inability of development agencies to understand and improve the performance of the organizations they support continues to impede progress in the developing world, even after a decade of reforms. Strengthening the institutions that receive those grants and loans — including government ministries and executing agencies as well as nongovernmental organizations — has become the key to improving the efficiency and effectiveness of development assistance. Download Complete Article – 2.7MB PDF

Blogs

Whose knowledge matters?
Evaluation for Development, 3 August 2018

Learning about learning in an adaptive program
With Arnaldo Pellini. Better Evaluation, 21 March 2017.

Indonesia’s knowledge sector is catching up, but a large gap persists.
With Arnaldo Pellini and Helen Tilley. The Conversation. 16 November 2016.

Science and Research in the Public Interest – the Role of National Science Academies.
Research to Action. 10 November 2016.

Remarks to the Indonesian Young Academy of Scientists – ALMI
AIPI Silver Jubilee 29 May 2015

Whose Development Results Count?
Norrag News. August 2, 2012

Videos

April 7th, 2020. The role of institutions for transformation in the global South 
https://bluemarbleeval.org/latest/role-institutions-transformation-global-south

2017 & 2018. UNESCO Montevideo. MOOC (Massive Open Online Course).
Desigualdades en América Latina y el Caribe: Investigación, Políticas, y Gestión para las Transformaciones Sociales.

Module 1: Experiences from the Field: https://www.futurelearn.com/courses/desigualdades-america-latina/1/steps/266622

Module 3: Evaluating for Equity.
https://www.futurelearn.com/courses/desigualdades-america-latina/1/steps/275690#fl-comments

Module 4: From Knowledge to Policy
https://www.futurelearn.com/courses/desigualdades-america-latina/1/steps/276927

2016. Rethinking Research Policies and Practices. (at 43:30). Centre for Innovation, Policy and Governance, Jakarta.

2012. Emerging Practices in Evaluating Policy Influence. My M&E eLearning Series on Emerging Practices in Development Evaluation.

2010. Development is Not an Intervention. Presentation to the Virtual Conference on Evaluation, Witwatersrand University.

2009. Evidence-Based Policy Making. IFPRI, Washington. DC.

About Fred

Fred is an evaluation specialist with over 30 years’ experience in evaluating development research and programming, and the development of evaluation methodologies. In addition to his focus on methods, Fred has interest and expertise in the use of evidence in public policy. In 2015 Fred established Using Evidence Inc., a global consultancy that focuses on efforts to improve the use of research and evaluative evidence in decision making.

Prior to establishing Using Evidence, Fred spent two years in Jakarta as Lead Technical Advisor to the Knowledge Sector Initiative (KSI), a 15-year Australian-funded project to support the knowledge-to-policy cycle in Indonesia.

Before joining KSI, Fred was Director of Evaluation at the International Development Research Centre in Canada, where he worked from 1993 to 2013. He is the author or co-author of numerous publications, including Local Knowledge Matters, Knowledge to Policy: Making the Most of Development Research, Outcome Mapping, and Enhancing Organizational Performance. He has published articles on other evaluation challenges, such as evaluating equity and strategy evaluation. He continues to write, with primary interests in the use of evidence in public policy, implementation evaluation, and the evaluation of change at the macro level. See Publications for a complete list.

Fred’s current work includes research on the use of agricultural research evidence at the national level in Africa, evaluating implementation research, exploring the implications of the Fourth Industrial Revolution in middle income countries, and integrating learning and evaluative thinking more effectively into organizations.

Fred serves on the Board of the Partnership for Economic Policy and is an Advisor to the Doing Research project of the Global Development Network. He sits on the editorial boards of the American Journal of Evaluation, Evidence & Policy, and New Directions for Evaluation.

Fred holds a PhD from the University of Montreal (1990) and completed a Fellowship in Sustainability Science at Harvard (2008). He lives in Ottawa, Canada.