Although the response rate from faculty members has gone up significantly, many people still feel the annual dean evaluation process needs a little "tweaking," said Ohio University Provost Stephen Kopp.
Every year, Group I faculty (tenured faculty or faculty on the tenure track) evaluate the dean of their respective college by filling out a one-page questionnaire and rating the dean on a scale ranging from unsatisfactory to outstanding. This year response rates averaged 55 percent, up from last year's average of about 40 percent. But for the surveys to be statistically sound, response rates would have to reach 75 percent, Kopp said. Only the College of Education exceeded that threshold.
"A problem is that not everyone responds," said Kathy Krendl, dean of the College of Communication. "Usually the people who write have more passion than those who don't. It just shows that not everyone takes these evaluations seriously."
A committee of four to six faculty members from each college reviews the completed evaluations. The College of Arts and Sciences is the only college with a six-member committee because of its size, said Barbara Reeves, associate provost for academic affairs. Faculty Senate elects half the committee; the other half is chosen by the provost's office after the dean makes recommendations.
This year, of the ten colleges on campus, five deans were given annual reviews, and two, Raymond Tymas-Jones in the College of Fine Arts and James Heap in the College of Education, were given comprehensive five-year reviews.
Kopp, in his first year as provost, made a few changes to the process with the goal of raising what had been lackluster response rates. Without adequate response rates, Kopp said, the evaluations are merely opinion surveys.
One change standardized the survey across all colleges. Previously, each college developed its own survey, Kopp said. This year, the core questions were the same for everyone, and the survey ran only one page.
"That was a huge advantage to the system," Heap said. "It made comparing colleges a little easier."
Another change added face-to-face discussion of the evaluations: the deans met with their respective committees, and the committees in turn met with Kopp.
"I feel we've done a lot to improve the process, as evidenced by the increased response rates," he said. "The goal is to improve what we're doing and encourage the best practices."
Several of the deans felt that Group I faculty should not be the only people evaluating them.
"College deans serve all faculty, staff, other colleges and external collaborations," Kopp said. "The lens that the faculty look through may be limited because they aren't the dean's only contact."
The final concern shared by several deans was anonymity. Before responses go to the committees, the deans or the provost, they are rewritten by an impartial third party.
Because responses are anonymous, some comments become personal statements instead of true evaluations, Kopp said.
"Someone always has an axe to grind," he said. "I applaud the deans for their professional handling of the situation."
Response Rates

College                            2002-03   2001-02
Arts and Sciences                  34%       20%
Business                           64%       65%
Communication                      56%       44%
Education                          79%       46%
Fine Arts                          57%       22%
Health and Human Services          50%       47%
College of Osteopathic Medicine    46%       *
Honors Tutorial College            *         **
Engineering                        *         ***

* Deans are not evaluated in their first year.
** HTC Dean Joseph Berman was retiring and was not evaluated in 2001-02.
*** The College of Engineering had an interim dean in 2001-02.

Source: Post Research
Katie Primm