June 29, 2009

The end of the world as we know it...

Lately I've been engaged in one of my least favorite pastimes, political in-fighting. My university, firmly situated in BFE Oklahoma, has not yet felt the financial pressure many East Coast and West Coast universities are feeling. We tend to lag the nation and have been somewhat cushioned by increases in state revenue due to high energy prices. We're not immune, though, and worry over next year's budget is palpable. The response of some administrators to budget pressures has been to require faculty to charge academic-year salary to grants to help offset costs. As you might imagine, there is push-back from faculty, especially since NSF has changed its salary policy to limit total grant-derived salary to two months per year.

While I am not directly involved in the budget process--a privilege/responsibility jealously guarded by administrators--it is not much of a stretch to see the money going to support costs of the new research buildings going up all over campus. There was a recent article in the Chronicle of Higher Education in which a panel from the Association of American Universities, an organization that represents top research universities, asked the National Academies to study whether the country needs fewer, but better, research universities.

This is an interesting question. On one hand, it is logical to have doubts about the purity of the motives of a panel selected by an exclusive club. "You need to be spending more money on research, but don't waste your money on 'those' people. Your money would be much better spent at the 'right' schools... really, you can trust us!" My knee-jerk response is the one that ends with "...and the horse you rode in on." The Stanford scandal showed what happens when university administration sees research as a source of unrestricted funding. On the other hand, my (admittedly limited) experience as a graduate student at a research university and as faculty at a research "wanna-be" has driven home that there are deep cultural differences between institutions. Top research schools have a culture that supports the Herculean trial that is good research. While research is possible at other schools, it is simply more difficult. As my favorite proverb states: "It is not just the mountain ahead, but the grain of sand in your shoe that wears you down." A thornier issue is that if large, second-tier schools make the substantial investment in research facilities, as my university has chosen to do, it is likely that resources are not flowing to student learning (discussed in a previous post).

Really, it seems that the question comes down to culture. I don't buy the argument that just because a university has been successful at research in the past, it should be on the "short list" for more research funding. I've sat on enough proposal review panels to know that the system already favors research schools. But I'm also experiencing first-hand the stumbles of a university trying to create the "cultural shift" required to rise in the ranks of research schools. What lessons can be drawn from schools that are efficient or effective at research (rather than simply doing a lot of research)? How does one change a culture, and what is a realistic time constant for the change? What are the signs that a school is even open to cultural change? Without asking these questions--and doing the research necessary to answer them--the question of whether the US can afford the current number of research schools is simply throwing gasoline on the ego-pyre that is research.
