
Reporting results

How you present results depends on your audience, your purpose, the stage in the evaluation or assessment process, and the resources available. To address these factors, you are likely to need several reporting formats, such as a formal written report and more interactive communication like informal meetings and presentations. Reporting should take place throughout the evaluation or assessment process.

Working sessions

Small, interactive meetings with key stakeholders, called working sessions, enable them to ask questions, express their views, propose ideas, and build consensus at different points in the evaluation or assessment process. Present preliminary results, highlighting key findings, and use feedback from stakeholders to interpret findings and determine how they might be used. Because working sessions are interactive and informal, they can be particularly effective in formative evaluations or assessments. Feedback from stakeholders can reveal what additional information is needed and guide the design of a future assessment. For example, after discussing the findings from focus groups, you might collaborate with key stakeholders to develop a survey.

Verbal presentations

Verbal presentations are more formal than working sessions, take 30 to 60 minutes, and focus on a limited number of topics. Visuals like PowerPoint slides and handouts are typically used. Briefly cover the evaluation purposes and methods, but spend the majority of time presenting results and recommendations and fielding questions. Make sure the presentation is easy to understand, especially when presenting to individuals whose time is limited. If resources are available, you may wish to include short animated computer presentations or video clips. Communicate just one main point per slide and avoid excessive text. Include any detailed tables in handouts. Practice your presentation and identify places to involve your audience. Avoid reading text off slides.

Written report

Determine the central message

Determine the central message of your report by distilling it into two sentences. Tie your results to the central evaluation questions. If those questions were lacking or vague, say so in the report.

Know your audience

Know who your audience is, particularly stakeholders, and anticipate what they are most interested in knowing. Your audience, in large part, determines how much background and technical information to provide. Avoid using jargon or the passive voice and check that long sentences are clear. Write and distribute the report in a timely manner.

Preface report with a summary

Begin the report with a two-page executive summary, concentrating on findings and recommendations. Briefly describe the program or technology and its central goals, explain the purpose of your evaluation or assessment, and devote just one paragraph to describing the method(s). Present key findings no more than two-thirds down the first page and consider using headlines to highlight them. Avoid footnotes and put the main point of each paragraph in the first sentence. In many cases, an executive summary is also used as a stand-alone document.

Background section

In the background section, describe in greater detail the program or technology, its history, objectives, accomplishments, and participants.

Method section

In the next section, describe your method(s) accurately but succinctly, and briefly discuss any shortcomings. For a survey, report response rates and how they were calculated. If you decided to weight responses, explain why you did so and how you computed the weights. State the dates the survey was administered or disseminated, the survey mode (in-person, telephone, mail, or electronic), and the population sampled. For all methods, report the demographic characteristics of participants, the number of initial participants, and the number who dropped out or were excluded from analyses for any reason. For an experiment, describe what educational activities took place in the treatment group and, if present, the control group. If you analyzed data from interviews or focus groups using qualitative methods like coding, explain the steps in the process. Cite any relevant literature and provide references at the end of your report. Include a copy of the instruments you used in an appendix.
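The response-rate and weighting calculations described above can be sketched in a few lines of Python. The counts and strata below are invented for illustration; they are not from any actual survey:

```python
# Hypothetical survey-disposition counts (illustrative only).
invited = 400      # surveys distributed
completed = 212    # usable responses returned

response_rate = completed / invited
print(f"Response rate: {response_rate:.0%}")

# A simple post-stratification weight: scale each stratum so its share of
# respondents matches its share of the target population.
population_share = {"undergraduate": 0.80, "graduate": 0.20}
respondent_share = {"undergraduate": 0.70, "graduate": 0.30}

weights = {group: population_share[group] / respondent_share[group]
           for group in population_share}
print(weights)
```

Reporting both the formula and the inputs, as the method section should, lets readers verify the rate and reproduce the weights.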

Findings and conclusions section

In the findings and conclusions section, present positive results first. Report means of outcome measures and provide statistics that are meaningful to your audience. For example, if you conducted a field experiment, you might report that scores increased 35% for the intervention group compared with 15% for the control group. With survey results, provide the complete wording of the questions you discuss along with frequencies and percentages of responses. If your sample was not representative of your target population, provide a caveat that you cannot make strong conclusions or generalizations. When reporting interview or focus group data, support assertions by quoting participants, including those who expressed a minority point of view. Place quotes in context by generically describing the participant (female, history major) and the topic she is responding to, being careful not to reveal her identity. Order focus group findings by the questions asked during the session. Discuss only questions relevant to your conclusions and recommendations, and don't report tentative findings. If you used several methods, decide whether you will present results for each method separately or integrate results from several methods when discussing an important issue.
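Frequencies and percentages for a survey question are easy to tabulate before writing them up. The responses below are invented for illustration, as is the question wording:

```python
from collections import Counter

# Hypothetical responses (illustrative only) to the question:
# "How satisfied are you with the new course-management system?"
responses = ["Very satisfied", "Satisfied", "Satisfied", "Dissatisfied",
             "Satisfied", "Very satisfied", "Dissatisfied", "Satisfied"]

counts = Counter(responses)
n = len(responses)
for answer, freq in counts.most_common():
    # Report both the frequency and the percentage, as the text advises.
    print(f"{answer}: {freq} ({freq / n:.0%})")
```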

Summarizing results

To summarize key results and conceptual relationships, present findings in a table, concept map, graph, or diagram. Keep tables and figures simple and self-explanatory. For tables and graphs, provide a clear, descriptive title, the number of people responding to the question, and any necessary keys and footnotes. For graphs, label both axes clearly, using equal intervals on the X (horizontal) axis. If the Y (vertical) axis does not begin at zero, clearly indicate this. If you use bar charts, all bars should have the same width. Recheck the accuracy of the data.
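The graphing rules above (labeled axes, a zero baseline, equal-width bars, a sample size in the title or label) can be followed mechanically with a plotting library. This sketch uses matplotlib with invented percentages; the benefit names and the n are assumptions for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Hypothetical percentages (illustrative only).
benefits = ["Time management", "Course instruction", "Equipment use"]
pct_yes = [62, 48, 35]

fig, ax = plt.subplots()
ax.bar(benefits, pct_yes)            # bars share the same width by default
ax.set_xlabel("Reported benefit")    # label both axes clearly
ax.set_ylabel("Percent responding Yes (n = 120)")
ax.set_ylim(bottom=0)                # start the Y axis at zero
ax.set_title("Program benefits reported by participants")
fig.savefig("benefits.png")
```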

Cross-tabulation tables can illustrate discrepancies between needs and services. For example, the table below shows that participants were more likely to report receiving inadequate than unnecessary technical assistance:
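A cross-tabulation like the one described can be computed directly from respondent-level data. The sketch below uses pandas with invented responses; the variable names and counts are assumptions, not the original data:

```python
import pandas as pd

# Hypothetical respondent-level data (illustrative only): whether each
# participant needed technical assistance and whether they received it.
df = pd.DataFrame({
    "needed_assistance":   ["Yes", "Yes", "Yes", "No", "No", "Yes"],
    "received_assistance": ["No",  "No",  "Yes", "Yes", "No", "No"],
})

# Needed = Yes / Received = No marks inadequate assistance;
# Needed = No / Received = Yes marks unnecessary assistance.
table = pd.crosstab(df["needed_assistance"], df["received_assistance"])
print(table)
```

Each cell of the resulting table is a count of respondents, so discrepancies between needs and services can be read off directly.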

A table can also summarize common program or technology benefits or challenges:

Benefit                                          % Yes
Learned program management skills
Improved time management
Improved course instruction
Learned new information about campus resources
Learned about a specific piece of equipment

State conclusions based on the findings you present, drawing on patterns in the data and, when appropriate, information from other methods and sources to provide a fuller picture. Discuss the strengths and limitations of the study. Discuss the impact of the findings and the factors that appear to be contributing to the results. For example, if results show that objectives are frequently not met, it may be useful to discuss how this is affecting quality, participant satisfaction, and perceptions. Analyze the context so its influences can be identified. If the program or technology did not meet its goals, describe any special contextual factors, such as delays or a lack of resources, that may explain the poor results.

Avoid misrepresenting results because of political pressure or the desire to secure future contracts. Discuss findings with administrators, but if they strongly disagree with your conclusions, test the validity of their objections by gathering additional information or consulting experts, rather than simply yielding to their wishes. Any potential conflict of interest should be dealt with openly at the outset of the evaluation or assessment.

Recommendations

Recommendations should follow from evaluation or assessment conclusions and consider all possible explanations. Like conclusions, they should be linked to study objectives. Develop concrete and practical recommendations in consultation with stakeholders to increase the chance they will be implemented. Torres (2001) suggests positioning stakeholders as problem-solvers rather than culprits. Recommendations for incremental change or adjustments to current practices are more likely to be adopted than those that call for a major overhaul of an organization. Although administration should be consulted, sometimes evaluators need to make recommendations that administrators may not support. In these cases, make recommendations as diplomatically as possible.

Additional information

Grob, G. F. (2004). Writing for impact. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of Practical Program Evaluation (2nd ed., pp. 604-627). San Francisco: Jossey-Bass.

Torres, R. T. (2001). Communicating and reporting evaluation activities and findings. In D. Russ-Eft & H. Preskill (Eds.), Evaluation in Organizations: A Systematic Approach to Enhancing Learning, Performance, and Change (pp. 347-380). Boston: Perseus.

Page last updated: Sep 21 2011
Copyright © 2007, The University of Texas at Austin