Reporting and Sharing

This stage of the evaluation process is often neglected, and frequently not considered in any detail until after the data have been analysed. While reporting is generally treated as the end point of an evaluation, it is often helpful to build some aspect of reporting into earlier stages. This needs to be weighed carefully against the need for the practice to be ‘uncontaminated’ by the evaluation, but interim reporting can be a valuable means of maintaining participant motivation for the evaluation process. Some evaluation designs, for example process evaluation or formative action research, have regular reporting and feedback as an integral part of the evaluation process.

Academic researchers are required to publish their work in books and journals, but in-service evaluation is quite different in this respect. There is a specific audience awaiting the results, and it is important that those results are presented in a way that is useful to that audience. This may require a different format from the traditional ‘academic’ report, such as a short written summary or an ‘in person’ presentation to a group. The need for results to be presented in an immediate and accessible way is a strength of the evaluative approach, but it also means that the results of an exercise are likely to reach a relatively small number of people.
This is particularly true within the probation services, where much useful evaluation of probation practice is not published outside the local service in which it takes place. This point was made by Chapman and Hough (1998, p. 107) and is attested by the wealth of material provided for the survey of evaluations undertaken for this publication, much of which had not previously been widely available. NPRIE provides a forum for some sharing of this material, frequently on an informal basis, and more and more services are now making copies of evaluation reports available to other services through the ACOP Bulletin. In conjunction with the publication of this handbook, the three main probation service libraries have agreed to establish and maintain a collection of evaluation reports (see 3.6.3). However, there is a need for evaluators of probation practice to consider other routes to wider publication, such as practice journals, and for probation managers to encourage and facilitate this.
At the same time, it is important to consider issues of ‘ownership’ of the material, and the rights of different parties to see, and possibly to require changes to, a report before wider publication. Where the evaluator is externally contracted to undertake the evaluation, these issues should be covered in the terms of the contract. Where the evaluator is an employee of the organisation for which the evaluation is undertaken, they should be negotiated during the planning stages of the evaluation and agreed in writing. In some circumstances contracts of employment will include reference to publication. See Chapman and Hough (1998, p. 106) on the right to publish.

Another important consideration in relation to publication is the commitments made to participants in the early stages of the work. Where participants were promised copies of a report, or the right to comment on drafts before publication, these should be budgeted for in both time and cost, and the promises kept. Where assurances of confidentiality have been given, reports should be double-checked to ensure that individuals cannot be identified from the material presented.
The reporting stage is the point at which all the work on the evaluation comes together and the evaluator’s reflections, understanding and interpretation of the results are presented to others. Like every other stage of the evaluation process, it should be guided by the original questions and purposes of the evaluation. The purpose of reporting is to explain the work to others, present your interpretation of the findings, and locate them appropriately within the framework of existing evidence and policy.
  • The audience

    Different audiences require different sorts of reports and presentations of the findings of an evaluation. Consideration should be given to producing several reports from a project as a means of ensuring that all those who need to know the results and implications of the evaluation are informed. Reporting is a time-consuming exercise, and unfortunately one that is frequently skimped. Time should be budgeted within the evaluation to do it properly, and to present the results in a variety of formats for different audiences.

    Consideration of the following questions will help to determine the most appropriate form and style of reporting.
    - Who are the audiences (managers, service delivery staff, academics)?
    - What questions do they want answered?
    - What were the original evaluation objectives?
    - Has the context for receiving the results changed since then?
    - How much background does the audience need?
    - Which aspects of the work should be given greatest prominence?
    - What type of presentation will be best received and understood?
    - Will one report/presentation meet the needs of all audiences?
  • Case study

    Delivery report on the Lincolnshire sex offender programme

    This is an example of a routine management report produced by Lincolnshire Probation Service (1998). It illustrates the point that ‘operational’ reports (see Table 3.1) meet a different need from ‘strategic’ ones. A report is produced after each run of the sex offender programme according to a standard format, covering the following items.

    Throughput – nine people were due to start; all actually started, and all completed the programme.
    Static data – profile data on each offender, including age, sex, race, offence and risk assessment.
    Impact on attitudes – average pre- and post-programme scores are given for the Sex Offence Attitude Questionnaire (SOAQ) instrument.
    Feedback from participants – highlights were given in text form.
    Cost – the total cost of running the programme is given, covering staff costs and incidental expenses.
  • Alternative outlets for reports

    Evaluators of probation practice, particularly those who are employees of the service rather than externally contracted academics, should consider reporting their evaluations more widely. Possible outlets include journals such as Vista or Evaluation, and relevant academic journals such as the British Journal of Criminology or the Howard Journal. There are also practice journals where reports could be presented, such as Probation Journal, Community Care or Justice of the Peace.

The evaluation report

Evaluation, if it is to be accessible to and understandable by key stakeholders, must depart from the trends of the various social science disciplines and return to simplicity as a virtue in data presentations. (Patton, 1997, p. 310)

A written report of the evaluation will be needed, even where other mechanisms of dissemination are also employed. Without some ‘hard’ record of the results of an evaluation, the impact of the exercise is likely to be minimal and to dissipate over time. The traditional academic report follows a well-established format: it starts with an introduction outlining the background to the study and the questions it was designed to address, describes the methodology in some detail, presents the results and ends with a short discussion of those results.
This format is not the most appropriate for the report of an evaluation, where the needs of the audience are different. The implications for policy and practice are the most important aspect of an evaluation, and should therefore be given prime position in both placement and space. There is less need for justification, particularly in the form of a detailed description of the method and a full analysis.
- Give an executive summary or bulleted list at the beginning outlining the key points from the report.
- Put supplementary materials, which some readers may find useful but are not essential to understanding the main points of the report, in appendices.
- Organise the report clearly with a logical flow through the material being presented.
- Present evidence economically to substantiate the key points made.
- Use plain English in a simple style to improve the report’s readability and value.
- Keep your sentences short, informative and jargon free so that they are easy to understand.
- Use simple, straightforward presentation of material to give decision makers access to the findings.
- Use charts as an effective way of presenting material succinctly.

The conclusions of the evaluator, and the recommendations for policy and practice based on the results of the evaluation, are the most important sections of the report. It is helpful (see Patton, 1997, p. 307) to distinguish between:
findings – the results of the evaluation, what has been learned
interpretations – inferences drawn from the findings, including the relationship of this work to previous research and evidence; inevitable complexities of the findings should be articulated rather than sacrificed in the search for simplicity of presentation
judgements – made by the evaluator within the context of the evaluation, such as what is positive and not so positive about the findings, and whether the benefits are worth the cost
recommendations – suggested courses of action based upon the findings, highlighting their relevance to policy and practice.

It is a good idea to share this section informally, during drafting, with service staff and managers, in order to test alternative interpretations, judgements and recommendations.
Overall, the contents of the report must be easy to understand and must enable the reader to make a reasonable judgement about the quality of the evaluation, the worth of the results and the implications for policy and practice. This simplicity does have a disadvantage for the evaluator, in that it disguises the considerable amount of work involved in the exercise, and the vast amount of data and analysis that has been sifted and honed down to the key points presented. Unfortunately this can sometimes lead the uninformed reader to believe that good evaluation is easy!

Bell (1987, p. 135) has a useful checklist to use on the final draft of a report, which includes the following questions.
- Is the meaning clear? Are there any obscure passages?
- Are conclusions based on evidence? Have any claims been made which cannot be substantiated?
- Are recommendations feasible?

This checklist is also a useful guide for critiquing reports prepared by others.