Faculty Senate to vote on proposed electronic, face-to-face course evaluations


Todd Diacon, the provost and vice president of academic affairs, speaks during the Faculty Senate meeting Monday, Oct. 9, 2017.

Valerie Royzman

Faculty Senate is expected to vote on a new recommendation for student course evaluations at its first meeting of the semester Monday.

The suggested electronic, in-classroom Student Survey of Instruction, otherwise known as SSI, will address concerns across departments at the university and provide students with a better opportunity to evaluate faculty, in turn yielding valuable feedback for professors.

“I think there was a growing sense that there were certain aspects of the Student Surveys of Instruction that were not really serving us very well,” said Jennifer Marcinkiewicz, the director of the Center for Teaching and Learning.

Marcinkiewicz is chairing the Student Survey of Instruction Review Committee, which is charged with reviewing student course evaluations and the university’s implementation of them.

The cost of paper of in-person SSIs, coupled with the long hours spent sorting through them, is an issue Kent State has grappled with for years.

“The university devotes a lot of time and money to paper evaluations fed through a scantron,” Marcinkiewicz said. “There’s a lot of administrative time devoted to this. There are usually departmental secretaries that spend hours and hours putting together the packets.”

Switching to strictly online SSIs, or Flash Surveys, isn’t a solution, however, because online course evaluations tend to see significantly lower response rates, Marcinkiewicz said. Another concern with the current SSIs is that the questions don’t account for the different ways in which students take classes.

For example, questions addressing the physical space of a classroom appear on the SSIs sent to students enrolled in online courses.

“There’s an interest in having things fairly uniform so that a faculty member teaching in an online class and a faculty member teaching in a face-to-face class can have some significant overlap in terms of what they’re evaluated on,” Marcinkiewicz said. “And yet, the old survey did not really recognize that there were fundamental differences.”

Robert Trogdon, the chairman of the English department, said paper SSIs get a better response rate than online ones. From what he’s seen in his five years as chair, the paper response rate is about 70 to 80 percent, though he has no records of exact rates.

“Typically when I look at evaluations for instructors for online classes, sometimes there will only be two responses out of 24 students,” Trogdon said. “It’s sometimes not even 50 percent of the class.”

Comments from students on SSIs are few and far between, though they are helpful when issues arise about an instructor, Trogdon said.

“If I see a pattern in a class where a lot of students are saying the same thing about an instructor, that is helpful to me about identifying problems,” he said. “If it’s one student, it’s not as helpful sometimes because it could just be a disgruntled student.”

Low participation in student commentary, both paper and online, remains a common trend for the political science department too, chairman Andrew Barnes said. He tends to witness the same patterns Trogdon outlined in terms of paper SSIs.

Over the last three years, “participation has historically been between 50 percent and two-thirds” of students who complete paper SSIs in person, Barnes said.

Unconscious student biases are another worry of the committee.

“I would say that there are some very legitimate concerns that depending upon the identity of the students that are evaluating the faculty that racial, gender, religious, LGBTQ status, all of those things do have the potential to bias student evaluations,” Marcinkiewicz said.

It’s for this reason that the proposed changes to SSI reporting no longer rely on norming groups, which Marcinkiewicz defines as “a comparison group of faculty teaching similar classes.” Essentially, faculty members were evaluated against a group average, and whether they fell above or below it measured their success.

Overreliance on this comparative data sparked concern when the committee began developing this electronic, in-person SSI recommendation about two years ago.

“It’s an unfortunate thing that it’s not a consistent misuse, but it happened frequently enough to cause some concern,” Marcinkiewicz said.

Barnes said use of the norming group in the political science department was convenient, but there were some flaws.

“Figuring out how to measure the quality of teaching is a difficult thing, but it’s an important thing as well, so it’s an issue we need to keep working on,” he said.

Trogdon opposes the norming group method and said the values it yields are “not statistically significant.”

“It became much, much too easy for people to latch on to those numbers and go, ‘They’re good because they’re above the norm and they’re bad because they’re below the norm,’” he said. “That’s a lot of weight to put on an evaluation that students spend sometimes 10 minutes doing.”

Last spring, the Student Survey of Instruction Review Committee performed an electronic pilot on students in the “First Year Experience” course, along with those in theater and nursing courses. The pilot featured questions about faculty geared toward four specific topics: commitment to learning, creating an environment of mutual respect, challenging students to think and explaining material clearly.

The pilot also included a drop-down comment section, which asked follow-up questions based on initial responses. For instance, if a student disagreed or strongly disagreed that a professor shows mutual respect, the survey would ask the student to provide examples and suggestions for improvement.

“Students wrote more than four times as many comments using that particular directive prompt than in the traditional paper scantron,” Marcinkiewicz said. “So that was a really, really exciting result.”

The committee is also recommending a list of supplemental questions so departments and individual faculty members can receive input geared more specifically toward their curriculum.

If approved by Faculty Senate, Marcinkiewicz predicts summer would be the earliest the new electronic SSIs could be implemented, though the timeline is up to Faculty Senate and the administration.

The review committee itself won’t be making a recommendation for a potential vendor, but eXplorance Blue, the company that administers Flash Surveys, may be considered.

Marcinkiewicz said the committee’s recommendation is about encouraging “a much more holistic view” to help faculty truly improve their teaching.

“We want to make sure student voices are heard — that’s first and foremost,” Marcinkiewicz said. “Secondly, we want to make sure that faculty are able to use this information in an informed way, and we want to make sure that everything is done to ensure fair and equitable evaluation of faculty teaching for personnel decisions.”

Valerie Royzman is an administration reporter. Contact her at [email protected]