
Your Detailed Evaluation Report

After you have decided who you will report to and in what format, you should create a detailed evaluation report that addresses all of your evaluation questions. You can then extract summary information from the detailed report for your relevant audience and adapt the report into the most appropriate format.

Your detailed report should include the following section headings and content. To help you with your detailed evaluation report, you can follow the template below. Additionally, a detailed final evaluation report for the Global Climate Change Alliance: Pacific Small Island States (GCCA: PSIS) project was produced by PREA and published by SPC for download, to share findings, lessons, and best practices. While the structure of that report differs slightly from the template below, you will find that all the key sections are covered.

Cover Page & Title

Your title should describe your project, and you may even want to relate it to the project’s overall goals.
An appropriate image will add visual appeal to your report.


Executive Summary

This is a summary of the main findings, lessons, and recommendations from your evaluation. Some people, depending on how busy they are, will only read the executive summary. It should not be longer than two pages.

Introduction

  • Overview of project and its goals
  • Key stakeholders and target audience
  • Program Logic

This should include an overview of the project being evaluated, including the timeframe, main stakeholders, and project goals. It is good to provide a program logic that outlines what you sought to achieve and what you did along the way. You may want to describe in greater detail particular activities that were critical in delivering outcomes.

Evaluation Framework

  • Purpose of the evaluation
  • Key evaluation questions
  • Evaluation team
  • Evaluation method (including limitations)

You should also outline the purpose of the evaluation, including the evaluation audience and what they want to know. This includes highlighting the key evaluation questions. You may want to include the full monitoring and evaluation plan as an appendix. It is important to note who made up the evaluation team. You should also provide an overview of the evaluation method (you can link this back to the M&E plan in the appendix) and any limitations in the methodology. You may want to use a table that highlights the quantitative and qualitative methods used as part of the evaluation.

Evaluation Findings

  • Key Evaluation Question 1
  • Key Evaluation Question 2
  • Key Evaluation Question 3 etc.

A good way to present your evaluation findings is to use your key evaluation questions as the main sub-headings (e.g. Was the delivery model effective in changing participants’ behaviours?).


You would then use the information collected through your monitoring to make a judgement and answer the key question. Remember here that you do not want to simply present information, but rather interpret the information and make a value judgement. Use graphics where appropriate, and remember that you do not have to present all the information you have collected. You may present some of the information from your monitoring in an appendix.

Conclusion and Recommendations

This is where you may want to give a high-level summary of the successes and lessons of your project based on your evaluation findings. You may also want to communicate how the evaluation findings will be used (in terms of informing future projects, changes in policy, etc.). You should also provide a list of key recommendations (which are also presented in the executive summary).

References

Provide details of any other publications or sources of information that you have used in your report.

Appendices

This is where you provide detailed information that some of your audience members may want to refer to. This includes your full M&E plan, questionnaires that were used, detailed results and information, statistical analyses, etc.
