Mathematics professor Neelesh Tiruviluamala is the highest-rated professor at USC, according to the user-generated ranking platform RateMyProfessors.com. His profile reveals tags that recognize him for his “amazing lectures,” his “caring” and “hilarious” nature and his ability to provide students with “good feedback.”
Large block letters at the top of Tiruviluamala’s page show that 100 percent of his students who were surveyed would take a class with him again, and according to the site’s metrics, Tiruviluamala has garnered a perfect score from the 56 reviews he has received.
Because students are able to read and write reviews of virtually any educator across the nation, Rate My Professors has amassed more than 19 million reviews of teaching professionals since its founding in 1999. The site says its mission is to do “what students have been doing forever — checking in with each other … to figure out who’s a great professor and who’s one you might want to avoid.”
At USC, students are encouraged to complete end-of-semester online course evaluations, which provide officials with insight into faculty’s teaching effectiveness and inform promotion and tenure decisions, according to Vice Provost for Academic and Faculty Affairs Elizabeth Graddy. However, students are not able to access their peers’ evaluations or generally view a professor’s rating.
Students like Kayla Brown, a freshman studying health promotions, turn to Rate My Professors as a resource for identifying professors who would offer a valuable learning experience. Brown hopes USC will consider publicizing students’ course evaluations.
“Some [students] have good experiences and some have bad experiences [with a professor], but if we’re all reporting those experiences, the sample data is just a larger pool,” Brown said. “Versus Rate My Professors, you’ll find two, three, maybe four to five posts. So if we have a bigger view [on the course], I think that will be immensely helpful.”
Although Brown acknowledges Rate My Professors’ shortcomings in providing a holistic perspective of a professor’s class, she considers it a useful preparatory overview before she decides to register.
“It’s actually influenced my class choices,” she said. “I definitely prefer [knowing how professors are], because then I know what I have to do to do better [in the course]. So if I know that one professor is a really hard teacher, then I know from the beginning to stay on top of the readings.”
With the ubiquity of public review forums such as Yelp, TripAdvisor and Rate My Professors, a culture of shared opinions and reviews has developed online.
Yet these sites can be influenced by voluntary response bias, in which users with extremely positive or negative experiences dominate the review process, according to a New Yorker article on workplace reviews. Similarly, students unfamiliar with a platform like Rate My Professors, or indifferent to its effects, might be entirely absent from the results.
Even the more rigorous USC-administered course evaluations do not eliminate bias in students, Graddy said, and they are not released publicly due to the University’s privacy concerns.
“What we do know is that there tends to be, in most of these evaluations, a bias against women and faculty of color, and so that’s problematic in our office,” Graddy said. “We’re trying to figure out how we can improve our instrument. I think the question is how do we try to keep improving the instrument itself to make it more specific. So, the more specific it is, the less likely, I think, those biases are to come in.”
The questions on USC’s evaluation form aim to measure metrics similar to those on Rate My Professors, such as clarity and approachability outside of class, but they are more specific and address the strengths and weaknesses of the professor.
Though the evaluations are supposed to be mandatory, response rates have fallen dramatically since the University switched from paper surveys to electronic forms, Graddy said.
“One of the things we’ve observed now is more incivility in the responses since they’re online,” she said. “There are some things that would disturb you in these course evaluations … and I wonder if students would write them if they knew they’d be made public.”
Rate My Professors’ accuracy garners mixed reviews among academics. Like Brown, Tiruviluamala noted the site’s reviews are highly subjective and leave out important information about a professor’s overall teaching style from an academic perspective. He compared the site’s popularity to that of social media comments, a phenomenon of the digital generation.
“[Teaching] is not like a movie; it’s a 15-week linear algebra course,” he said. “It’s different from what [an academic] would look at. Does this person push students that’s good for them long term in ways they don’t even know?”
A professor’s teaching style — how “hard” or “easy” they are — can affect students’ academic careers in the long run as they move on to more difficult concepts, according to Tiruviluamala.
Gendered language in teacher reviews can also skew results seen on the site, according to a 2015 data compilation by Northeastern University professor Benjamin Schmidt. In his analysis of around 14 million Rate My Professors reviews, Schmidt found that male teachers were more likely to receive positive words than their female counterparts, while female teachers received more reviews containing fashion-related words.
Despite potential implicit biases and the platform’s crowdsourced format, some students feel the anonymous platform provides a space for honesty. Elizabeth McAtee, a freshman studying classics and philosophy, politics and law, believes the site is a good indicator of a professor’s teaching style and class demeanor.
“I think not necessarily every single detail about Rate My Professors is 100 percent accurate, but the overall impression that it gives of a certain person can be useful in determining what a professor is like or what class you should take,” McAtee said.