It’s the last week of the semester, which can only mean one thing (that’s right, only ONE thing): it’s course evaluation time! Most faculty want to receive good scores on their course evals. With the right strategy, we can also get good information from them and use that information to improve our teaching.
It’s true that student evaluations are an imperfect means of evaluating teaching quality. But because they are so systematic, these evals are, practically speaking, the best available tool for finding out how students respond to our teaching. Like any tool, they need to be used properly, lest anyone get seriously injured.
In May I responded to Rebecca Schuman’s criticism of course evals with a blog post that appeared on the Chronicle of Higher Education website. That post prompted a comment thread on Facebook that turned up some fantastic ideas for how to get more out of your course evaluations.
In that thread, Temple University sociologist Dustin Kidd (follow his blog; buy his book!) offered a comprehensive, and illuminating, view of how he works with evaluations. Dustin reports that generating useful responses starts with how he administers the evals. His advice is to
tell students what kind of feedback you want and how you plan to use it. I always tell students that I’m planning to revise the course and ask them to use evaluations to help me with the process. I give them a list of elements from the course that names all the readings, films, assignments, and other components and ask them to use the list to give me specific feedback.
Yes, students know, in one sense, how to fill out evaluation forms. But they do not inherently know how to make those forms most helpful to you. You may want to know what they thought about the first unit in the course, which has been on your mind daily for three months. But during the last week of the semester and with final exams looming, they may not recall that unit right now. So, following Dustin’s advice, why not remind them? It’s even possible that by reminding students of everything you’ve done in the course, they will also think well of the course overall, which can’t hurt.
Dustin further explains to his students the role that their evaluation of his teaching plays within the institution:
I have a very explicit discussion with the students about how the evaluations are used in my performance reviews when I apply for merit raises, promotions, or awards. I tell them the department uses the evaluations to make decisions about what courses will be offered and who will staff them. Making the process transparent helps the students to take it more seriously.
I suspect (though I do not know for sure) that at a large university like Temple, students are glad to know that their voices really do matter to the institution. Dustin conveys to students that they are taken seriously; in turn, one hopes, they will take seriously the responsibility to inform the department and university about Dustin’s teaching.
Finally, getting the very most useful information out of your evals does not necessarily mean using every last bit of information. Some filtering is necessary:
In brief, I try to summarize the evaluations in short phrases that strip away the judgement (positive and negative). I then quantify them so I know how many people made each critique. Then I focus only on the good and bad critiques that were said several times, as well as any helpful suggestions (even if the suggestion was only made once). The process helps me avoid zeroing in on the evaluations with really strong language and focus instead on the big picture.
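Dustin’s filtering step is essentially a tally: strip each comment down to a short, judgement-free phrase, count how often each phrase recurs, keep only the recurring critiques, and keep every concrete suggestion even if it appears once. A minimal sketch of that process (the phrases, the threshold of three mentions, and the function name here are all hypothetical, not Dustin’s actual data):

```python
from collections import Counter

def filter_evals(critiques, suggestions, threshold=3):
    """Keep critiques made several times; keep every suggestion.

    critiques: list of judgement-stripped phrases, one entry per mention
    suggestions: list of concrete suggestions (kept even if made once)
    """
    counts = Counter(critiques)
    # Only critiques repeated at least `threshold` times survive the filter.
    recurring = {phrase: n for phrase, n in counts.items() if n >= threshold}
    # Deduplicate suggestions while preserving the order they appeared in.
    return recurring, list(dict.fromkeys(suggestions))

# Hypothetical paraphrased responses from one section's evals:
critiques = ["too much reading", "too much reading", "unit 1 pacing fast",
             "too much reading", "exams unclear", "too much reading"]
suggestions = ["post slides before class", "post slides before class",
               "add a review day"]

recurring, ideas = filter_evals(critiques, suggestions)
# recurring → {"too much reading": 4}; the one-off critiques drop out,
# but both suggestions are kept.
```

The point of the tally isn’t precision; it’s discipline. Counting forces you to treat a single scathing comment as one data point, not a verdict.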
It’s not surprising that Dustin takes a methodical approach, placing the evaluations within their institutional context and seeking to filter out the “noise” in the responses. He is a sociologist, after all. But this approach is not just good social science. It is good for the professor’s mental state, because it’s a way to avoid the twin pitfalls of vanity (upon reading a positive comment) and anger (upon reading a really negative one). What one student says might be interesting, but what many students say is real evidence, which can then inform your pedagogical decisions in later semesters.