Faculty Council Meeting
22 January 2007
Course-Instructor Survey (CIS)—Student Comments
and Electronic CIS Surveys
During Fall 2006, as The University began wider testing of electronic course-instructor surveys, the University attorneys determined that students' open-ended comments on electronic surveys would be subject to open-records requests, whereas similar comments on handwritten course-instructor surveys are not available to the public.1 This legal determination results in (1) a disparity between the two methods of gathering student evaluations and (2) the potential for wide and long-lasting dissemination of student comments, in whole or in selected part, through the internet.
The CIS Committee has several concerns about the public availability of student comments. One drawback is that while student comments have value to faculty for improving their teaching, research indicates that student comments cannot be taken at face value. Research (see Appendix A) indicates that students' objective-answer evaluations are generally useful for evaluating teaching because they are directed at specific features of teaching that research shows are important to teaching effectiveness and student outcomes. Proper use of students' open-ended comments, however, requires more skilled interpretation, and the commonly accepted professional view is that such comments should primarily be used by instructors to improve their own teaching (formative purposes) and should not be used for personnel decision-making (summative purposes). Thus, without education in the appropriate interpretation of objective-answer scores and student comments, student comments have limited benefit beyond helping the instructor improve his or her own teaching.
Of more concern here, however, is the potential harm to an instructor when an occasional dissatisfied student submits scurrilous written comments that lack factual justification and may be utterly unjustifiable. The potential exists for such comments to be published on the web, out of context and without verification, and thereby disseminated widely and in perpetuity. This could wrongfully harm the instructor, and it could put The University at risk for making available and facilitating the distribution of these anonymous comments.2
Research conducted on UT’s course-instructor surveys indicates that objective-response answers and open-ended comments provide essentially the same evaluation of an instructor (see Zimmaro). Because of problems such as variable interpretation and lack of assured representative sampling with the optional open-ended comments, twice in the past seven years the Faculty Council has affirmed that student comments should not be considered an official part of the course-instructor surveys but rather are private communications between students and instructors. The 1999 Faculty Council ad hoc Committee on Course Instructor Surveys reported regarding student comments: “Written comments on student evaluation forms are not governed by the Open Records Act that governs access and release of answers to survey items. This communication is offered and received as a private act of communication between student and faculty. The committee recommends that the privacy of both the students and faculty be respected and preserved and in so doing it becomes the purview of the faculty member as to how and when these are released.” The 2005 Ad hoc Committee of the Faculty Council to Study Course Instructor Surveys repeated this point in its November 21, 2005, report to the Faculty Council, and the Faculty Council affirmed this in its motion that “The official University CIS will consist of only objective questions.”
Thus, on behalf of the Faculty, we request:
1) The Provost direct the University lawyers to determine a way to make the student comments on electronic course-instructor surveys not available as public documents or cease the inclusion of that feature on e-surveys.3
2) The Provost remind students and the public that valid concerns regarding instructors can be raised through means other than the end-of-term official course-instructor surveys. These means include communicating directly with instructors and their supervisors.
1 As explained in a notice to faculty on December 4, 2006, from the Chair of the University’s Committee of Counsel on Academic Freedom and Responsibility and approved by the University attorneys: “All CIS documents whether in paper or electronic form are official government documents and are available to University employees who have a legitimate educational or business need for them. As is the case with all government documents, they are presumed to be public documents available to the public upon request in accordance with the Texas Public Information Act unless some exception to the Act applies. The Texas Attorney General is responsible for enforcing and interpreting the Act and has held that handwritten student comments are excepted from disclosure under the Texas Public Information Act because handwriting is identifiable, and personally identifiable information about students is confidential pursuant to federal law. Moreover, the University is not under obligation to type those comments that might then make them available. The situation is significantly different for the eCIS.
Typewritten student comments are presumed to be available for release to the public under the Texas Public Information Act, provided that any personally identifying information about the author or other students was first redacted. These anonymous comments could be available for collection, posting and archiving on various external, third party websites or by other means. However, any person in the general public seeking access to such information must follow the university's procedure and submit a written request to the Vice President for Business Affairs, who will ensure that the university complies with the law. Furthermore, the current records retention schedule for CIS and eCIS is 10 years, and all official documents and copies retained by the university are expected to be destroyed at that time in accordance with official records destruction procedures, making further disclosure by the university not possible. This, however, does not preclude their disclosure for longer periods of time through third-parties that acquired the student comments during their 10-year window of availability from the university. The public availability of these student comments collected via the eCIS system may affect your survey choice.”
2 One possible solution to this problem is to require students who provide comments to identify themselves, although that practice is at odds with the traditionally anonymous surveys. We prefer the solution proposed below.
3 Currently, for the UT eCIS, student comments are not linked to the individual evaluations, making their formative use much more difficult and contrary to UT's own directions to faculty on how to interpret course-instructor surveys and student comments. If The University continues to offer an eCIS that includes student comments, Measurements and Evaluation needs to find a way to keep the individual statistical evaluations distinct yet connected to the specific individual student comments.
Selected Comments from Research Studies on
Course-Instructor Surveys and Student Comments
Below are excerpts illustrating some of the reasons that objective-answer student responses and student comments require appropriate interpretation. The literature also concludes that student evaluations are one very important feature in evaluating teaching effectiveness; other features include peer evaluation and student outcomes. Thus, this list of cautions about student evaluations, and particularly student comments, is offered to remind us that student comments are just that, "comments," and should be weighted accordingly.
Questions about Student Evaluations in General
- “Students’ expectations of grades influence their ratings of teachers” (Valen Johnson, cited in Gray and Bergmann, 45-46).
- Daniel S. Hamermesh and Amy M. Parker’s study at The University of Texas concludes, “instructors who are viewed as better looking receive higher instructional ratings, with the impact of a move from the 10th to the 90th percentile of beauty being substantial. This impact exists within university departments and even within particular courses, and is larger for male than female instructors.”
Formative and Summative Uses of Student Comments
Research on Student Comments (Zimmaro’s recent study is of UT’s CIS Evaluations)
Two primary reasons exist for evaluation of instruction: “improvement (formative) and personnel decision-making (summative). . . . Problems associated with using written comments [for summative purposes include]:
- “Validity. Student comments are valid to the extent that the comments are relevant to instructional issues and to the extent that students can be expected to tell the truth about their opinions. . . .
- “Generalizability. . . . Without a systematic sampling strategy, written comments cannot be assumed to be representative of student opinion within a class . . . .
- “Reliability. Single written comments and patterns of comments are often understood differently by different interpreters. Without a strategy for interpreting written comments, subjectivity can easily distort interpretations . . . . Systematic content analysis methods are available to address this problem; however, their implementation requires skills and resources.
- “Practicality. . . . truly appropriate use of comments would be expensive and difficult to enforce.” (Franklin and Berman)
- “. . . students who write comments tend to have stronger opinions, positive and negative” (Franklin and Berman)
- “Ory and Piper (1980)[‘s] results indicated that all three methods [objective response items, written responses to open-ended items, and group interviews] provided the same quality of information in that the method of collecting student evaluation information did not influence the ranking of professors. However, method of collection did influence the kind of information in that written comments and group interviews provided specific information for improving instruction that the quantitative information did not.” (Zimmaro, 1)
- This would suggest that student comments are generally redundant for summative evaluation.
- This would suggest that student comments are important for formative purposes.
- “The students who write positive comments are more likely to rate the instructor and the course as ‘excellent’ than are those who write negative comments.” (Zimmaro, summary)
- “Students in small and medium sized classes were . . . more likely to write positive comments than students in large classes.” (Zimmaro, summary)
- This indicates that course size biases written comments. (Zimmaro et al. do not indicate the level of the courses, but since the study occurred at UT, it is likely that the large classes are lower-division courses and the small and medium-sized ones are upper-division. Thus course content may also be influencing the outcome.)
- Students who self-reported an expected higher grade of A were more likely to write positive comments than students who self-reported a lower grade. (Zimmaro, summary)
- “As probable course grade decreased from ‘A’ to ‘F’ students were less likely to write positive comments.” (Zimmaro, summary 3)
- This indicates that expected grade correlates with written comments and may indicate a bias or a third factor as a cause.
- “Students who rated the course workload near the insufficient or light end of the scale were more likely to write positive comments than students who rated the course workload near the heavy or excessive end of the scale.” (Zimmaro, 13)
- This indicates that course workload correlates with written comments and may indicate a bias or a third factor as a cause.
Works Cited and Reviewed for this Report
Franklin, Jennifer and Elena Berman. “Using Students’ Written Comments in Evaluating Teaching.” Instructional Evaluation and Faculty Development 18, no. 1 (November 1998): http://iaes.arizona.edu/PAGEFILES/documents/facdev/iefd-18-1.html#.
Gaede, Charles S., Marilla D. Svinicki, and Dawn M. Zimmaro. “Predicting Student Evaluation of Instructor and Course: Revisiting the Relationship among Course Grades, Improving Teaching, and Summative Evaluation.” University of Texas at Austin. No date.
Gray, Mary and Barbara R. Bergmann. “Student Teaching Evaluations: Inaccurate, Demeaning, Misused.” Academe (September-October 2003): 44-46.
Hamermesh, Daniel S. and Amy M. Parker. “Beauty in the Classroom: Professors’ Pulchritude and Putative Pedagogical Productivity.” July 2003. http://www.eco.utexas.edu/faculty/Hamermesh/Beautystuff.html.
Zimmaro, Dawn M., Charles S. Gaede, E. Joel Heikes, Mi-Suk P. Shim, and Karron G. Lewis. “A Study of Student Written Course Evaluation Comments at a Public University.” University of Texas at Austin. No date.