A second study finds that…

Rate My Professors is a good source of information about university teaching.

Why does ratemyprofessors work as well as it does? Researchers at the University of Wisconsin-Eau Claire answered that question by surveying students about their use of ratemyprofessors. They found that people who post on ratemyprofessors are basically typical students — although men were more likely to post than women, and students in the arts and humanities posted less than those in other disciplines. The motivations for posting are varied, but the two most important are “warning others about an instructor” and “communicating that an instructor was excellent.”

Margaret Soltan, September 16, 2011 7:55AM
Posted in: professors


7 Responses to “A second study finds that…”

  1. Conservative English PhD Says:

    Interesting. Perhaps over time they do provide a good measure. I don’t claim to be excellent, but my student ratings (done at the university) so far are fairly high. However, I only have three reviews on RMP, and they’re all extremely negative.

    But then, I’ve only been teaching full time for a year, so we’ll see what my RMP reviews are like after three or four years. (And actually, despite the reviews being negative, I did modify my teaching this year somewhat, figuring that I should treat negative reviews as an opportunity to improve rather than just dismiss them as the work of disgruntled students.)

  2. GTWMA Says:

    RMP is a good source of information…if you have at least 14 RMP evaluations, and it’s a much better source for identifying good teachers than for identifying bad teachers, according to the articles.

  3. david foster Says:

    Most academics seem to feel that the formal teacher evaluations that universities conduct by surveying students are not very reliable, and indeed that they systematically over-rate easy graders and profs who don’t require much work. Which makes intuitive sense.

    Why would the RateMyProfs data not suffer from similar problems?

  4. Mike S. Says:

    @David Foster
    Why is RMP any different from course evals done at the universities themselves?
    The students who rate profs on RMP actually had to be motivated enough to go to the site and input some data. Based on that, one surmises that these students feel they have something of value to contribute; the ratings are the way that useful info gets communicated to others.

    University-administered course evals are passed out to nearly every student, usually at the end of the course (i.e., exam review time) or at the exam itself. To those students it’s just another of life’s meaningless chores, and they fill out the form in an uninterested manner; whatever they put down doesn’t reflect reality because they don’t care, so it’s just useless data. They come up with the rating in a single moment, which is why so much of it ends up being simplistic. The thought process of “class was easy, so this probably deserves a decent rating” is a simplistic, lazy one, exactly the kind of thing people do when told to do something they have no interest in; it gets the chore out of the way.

  5. Margaret Soltan Says:

    Mike S.: On in-class evaluations: I totally agree. Many departments — amazingly — continue to stand by them, even as superior electronic systems are widely available.

    Another thing about the in-class evaluations that makes them even less likely to be taken seriously — they typically ask TONS of questions, as opposed to the lean, mean RMP-type systems. And since each department often fashions its own form, there’s a real quality-control problem.

  6. Conservative English PhD Says:

    My institution has moved to an entirely online evaluation system – no more in-class evaluations. The response rate (so far) is around 1/3 of what it was when they were handed out in class. That may change, but it does mean the students who comment have stronger feelings.

    My own subjective experience indicates that negative feelings tend to be more of a prompt than positive ones, but we shall see how it works in the long run. Overall, my online evaluations weren’t much different from the in-class ones – we just had fewer of them.

  7. Margaret Soltan Says:

    Conservative English PhD: Response rate is certainly an issue.

    The studies I’ve seen seem to suggest that positive as well as negative feelings are prompts for filling these things out.

    But you point to another reason I like the online system – it’s absurd to force students to fill these out by taking up class time and making a big deal out of it. People have a right to opt out of rating their professors. Some schools provide incentives of various kinds, which is fine. But making it mandatory just invites lots of meaningless results.
