Senate Committee on Instruction, Curriculum and Advising
Posting Student Instructional Rating Data to the Web
Report on S-0008: "Review and report on the feasibility/desirability
of posting student instructional rating data (currently available on CD-ROM
in University libraries) online. Investigate whether the university should
provide additional methods of accessing course-evaluation information online,
such as establishing a university website for anonymous posting of comments
on instructors' performance."
Background: Newark and Camden have a long history of compiling
and making student instructional rating data available for their campuses.
A 1992 university document mandated that data would be compiled on all
campuses and summary reports would be produced and would be used in personnel
decisions, as negotiated with the AAUP. However, it was not until 1995
that the New Brunswick Faculty Council agreed that summary data, based
on the Newark model, could be released. This decision as to what data could
be collected and made available, and what it should be measured against, was
the result of a unit-by-unit consultation process and represents a university-wide consensus.
The New Brunswick Teaching Excellence Center (TEC) was made responsible
for the compilation and distribution of the data. Data for each semester
is put on CD-ROM and is made available in the University Libraries. Students
are then able to use the data as an information resource when considering
course registration options.
However, with the advent of online registration and the growing expectation
that electronic resources be available in a networked environment, restricting
information meant to support an online process to a CD-ROM that must be used
in the Library is a source of frustration for students.
Consequently, the New Brunswick TEC has received requests to make the teaching
evaluation data available on the Rutgers website. There have also been
individual requests for the establishment of a site where students could
anonymously post informal comments and evaluations.
Process: In the course of its deliberations, the Senate Committee
on Instruction, Curriculum, and Advising has consulted and/or received
recommendations from the University Senate Student Affairs Committee, the
Rutgers College Governing Association, and the staff of the New Brunswick
Teaching Excellence Center. The Committee also looked at the websites of
other AAU institutions to see how they are dealing with this issue.
Posting of Current Data: The data that is currently available on CD-ROM
exists as a searchable database on the New Brunswick TEC server, with access
currently restricted to TEC staff. The Senate Student Affairs Committee
and the Rutgers College Governing Association (RCGA) both recommend allowing
web access to the data as it currently exists, and most of the Committee
agreed that easing student access to this information would be beneficial
and that the data should be available to the University community on the web.
Explanatory Materials: Some Committee members felt that the data in isolation
could be misleading, and that the statistical anomalies (e.g., validity
of data for very small/very large classes, validity of data with minimal
returns, validity of comparative data, etc.) needed to be addressed. A
preliminary statement explaining the data is available on the CD-ROM and
on the TEC site; the TEC expressed a willingness to modify that statement
to include more information on data limitations and the statistical anomalies
that might be encountered when using the data.
Expanded Questionnaires: Some departments have chosen to add their own
questions to the base form in order to make them more meaningful to the
teaching culture of that discipline/department. [For example, the FAS Physics
Department uses five separate evaluation forms depending on whether the
course is an introductory course, a lab, a recitation, a seminar, or a
straight lecture course.] Where departments have added these department-specific
questions to the base form, the ratings are available in the released data;
however, the questions themselves do not display and are known only to the
department. Since this is potentially very useful data that is already
being compiled and reported, the Committee felt that, where the departments
agreed, these additional questions should display in the database.
"Open" vs. "Closed" Data Site: While there are some institutions that have
open access to course/teaching evaluations (e.g., Indiana University; University
of Chicago), these are institutions which offer profiles or narrative summations
rather than statistical data. Of the AAU institutions surveyed, those that
post statistical data all either require a log-in, or restrict access to
on-campus (IP checking) addresses. Since access to the data is essentially
for internal purposes, i.e., to give students better information with which
to make course-selection decisions, as well as for use in conjunction with
personnel actions, the Committee agreed that there would seem to be no
valid need to make the data available to the general public.
Log-in vs. IP Checking: Of these two options, the Committee felt that a
secure site requiring log-in would be preferable. Restricting access by IP address
would mean either that students could access the data only while physically
on campus, or that a proxy server would have to be set up for remote access.
Links: The Senate Student Affairs Committee and the RCGA recommended that
there be a direct link to the relevant rating data next to each specific
class listing on the Rutgers University Online Schedule of Classes. However,
whereas the course synopsis that is to be linked to the course listings
this year represents relatively static data, the evaluation data, and the
corresponding links, would have to be updated every semester. In terms of the labor involved,
the Committee found that this was simply not a viable option.
Online Form Completion: All parties consulted agreed that the majority
of students would not bother to go online to complete the evaluation forms.
This conclusion is in line with the experiences of other institutions where
online evaluations have been attempted. Since without participation the
data would be rendered useless, the Committee agreed that this is not a viable option.
Narrative Commentary: Students are asked to write additional comments on
the back of the data sheets when filling out the evaluation forms. Currently,
those comments are returned to the instructors for their use, and are not
compiled or distributed. These comments could potentially be very useful.
Indeed, there was general agreement that the University of Chicago's course-specific
narrative summations might be the ideal model to follow in terms of utility.
However, the amount of staff time required to do something comparable in
an institution the size of Rutgers would be prohibitive. The alternatives
-- either scanning and posting, or retyping, the comments portion of the
completed forms, or allowing students to comment online -- did not seem
to provide benefits that would outweigh the labor required or the inevitable
contention that would result. The student groups did not recommend narrative
options; the Committee concurred.
Recommendations: The student instructional ratings database, access to which is currently
restricted to Teaching Excellence Center staff, should be made available
to the University community as a restricted database accessible only by
log-in through a university account. The site should include an expanded
general statement explaining the data and its limitations.
There should be a link from the Rutgers Online Schedule of Classes site
to the database site. On the general Rutgers site, there should be a link
to the database from the "Information for Faculty and Staff" page (www.rutgers.edu/faculty-staff.shtml)
and from the "Information for Current Students" page (www.rutgers.edu/current-students.shtml).
Where departments have added questions to the base form, the Teaching Excellence
Center should seek agreement from those departments to have those questions display in the database.
"How to Read the Student Instructional Rating Form"
Whereas, the University Senate's Instruction,
Curriculum and Advising Committee has examined and reported on Posting
Student Instructional Rating Data to the Web; and
Whereas, the University Senate has reviewed
the Committee's Report and its Recommendations,
finding those Recommendations to be sound and in the
best interests of Rutgers University;
Therefore, Be It Resolved, that the Rutgers
University Senate endorses the Report on Posting Student Instructional
Rating Data to the Web, and urges the Administration to implement its recommendations.