Assess technology

Example: Usability recommendations

Usability recommendations and resulting actions (IAR 2.0)

1. No modification indicated for the site’s content, clean look, and well-designed glossary.

  • Revised content to reduce jargon and technical terms
  • Reduced white space between glossary words and incorporated user-requested pop-ups
  • Updated appearance and graphics

2. Reorganize information. The current structure does not match the ways users search for information on a Web site.

  • Organized information by different user groups or tasks
  • Reorganized bibliography by task
  • Tailored processes and data gathering methods to each user group/task
  • Simplified the organization of the assessment process and developed a new graphic
  • Reduced jargon and technical terms
3. Reexamine all labels to ensure they map directly to the information they index. Users had difficulty telling what the labels referred to, particularly the labels Best Practices, Interpreting Data, and Module 1-3.

  • Simplified the labeling for the assessment process into three phases
  • Described major tasks in the simplest terms possible (e.g., Instructional Assessment -> Assess technology)
  • Established a consistent organization and navigation for all tasks
  • Focused entire site around tasks/user groups rather than the assessment process

4. Reduce the amount of information per page and minimize the need for scrolling. Information-packed, long pages have at least three disadvantages for users:

  1. they usually dislike scrolling
  2. they miss information while scrolling
  3. they may not notice information they need

  • Chunked and linked long documents to reduce scrolling
  • Reduced extraneous information through editing
  • Reduced the amount of text on the home page and second-level pages

5. Make the breadcrumbs stand out. Using a different color may help. In particular, make the > character stand out more clearly.

  • Replaced breadcrumbs with tabs that identify which task the user is currently in
  • Used links and menus to create a more logical flow, as well as to provide autonomy for the user

6. Make important information stand out. Important information should jump out at users; the present design requires them to hunt it down.

  • Used color, font size, and menu order to emphasize document importance
  • Also guided the user to more important information through the site's organization (e.g., the user begins with the "Planning" phase already open)

7. Reduce the white space in glossary pages. Some participants did not like the white space in the glossary pages, because it required more scrolling up and down.

  • Eliminated white space

8. Make both PDF and HTML pages available. Some participants did not like PDF links, because such documents loaded slowly.

  • Incorporated all PDF pages as HTML content when feasible

9. Use more graphics and other media. The pages in the site are text-heavy.

  • Created the three-phase assessment process graphic
  • The new Web site template also provides more color and graphical appeal than the former design
  • Made the text less dense by eliminating jargon, chunking and linking long documents, and deleting unnecessary information
  • Made use of tables and flow charts to summarize information

10. Present examples modeling the nine-step process framework, or highlight their strengths. Some users preferred to look first at examples, yet the examples on the site are studies conducted at other institutions, so it is not easy to see their relevance to the methods, processes, or designs discussed on the IAR site.

  • Created additional examples that directly relate to common user tasks

Also added:

  • Information on student assessment and conducting educational research
  • Additional content for assessing instructional technology
  • Additional content for evaluating programs
  • Assessment and evaluation FAQs

Page last updated: Sep 21 2011
Copyright © 2007, The University of Texas at Austin