A little follow-up to the article referenced in "Heresy 101?". This one details the fate of the principal of a minischool in NYC that would teach some courses in Arabic in addition to courses on Muslim culture. It's an excellent example of a reactionary response to an attempted marriage of education and religion - though, at least from the reports I've seen so far, this attempt seemed fairly sensible.
Today I got to dig in a bit more deeply with classes. Part of that was a sort of impromptu manifesto on education and its merits, delivered to a group of mostly freshmen. I explained to them a bit about my own educational background - I was the first person in my family to get a degree, and I'm the only one with a graduate degree. I did it on student loans and a whim. It was that sort of deal.
And part of what I told them - maybe I should have held over the "Heresy 101" title - was that I believe in Education the way some people feel about Religion. Now note - and I'm sure most of them didn't get this impression - I didn't pick out a particular religion. I didn't compare education to any savior, philosopher, or writer of a tome that attracts tons of Hollywood money. Instead, I told them something personal. Education, I told them, has brought me more than tons of debt, and even if debt were all it brought, every dollar I'll be paying back till my 70s was worth it. That feeling, I said, is what I hope to help them come away with four years from now. I told them that what you get out of education is about what you're willing to risk. I said that they're only going to get out what they put into it. And that to get the most out of it, they have to confront things they don't like, things they fear, things nobody thought they'd ever possibly care about. You have to be okay with being wrong sometimes. I told them I'm not out to shock them or convert them. They don't have to agree with me because, among other things, I'm wrong almost as often as I'm right. I told them going to college to get a job is like going to the pool only to play in the shallow end. I told them I hoped they'd find some way to enjoy their education, and that I'd help them if I could.
It was, I think, the best lecture I've ever given, and I barely said a thing about the subject at hand. It was one of those moments where I felt like if they took even one thing I said out of the lecture, I'd have succeeded for the entire term.
Remember that faculty breakfast and dog and pony show? Well, there was more to it. One other interesting point from the recent faculty breakfast and cattle show was the address by the university president. Without giving too much away, the university president is, among other things, a fairly good-natured person, concerned with faculty ideas (or at least that's the sense I get based on his regular invitations to take faculty and students to lunch or dinner), and a Catholic priest. And in his address to the incoming and old faculty, he did something not so surprising, all things considered: he suggested that it wouldn't be a bad idea for faculty to begin classes with a moment of prayer.
Now this prompted all the usual responses:
you can't make us pray, we take public money
you don't have to pray, you can have a moment of silence
we're at an institution with a religious bent - and with a priest as president - so you can't really be surprised at this
etc, etc, etc.
Part of the president's point was that it wouldn't be strange to ask one part of a university to support another part of the university's mission. My thought, choked down as so many of them have to be, considering my own views on religion, was "Great. I'll pray at the start of class when you spend 5 minutes at the beginning of each mass talking about proper grammar and punctuation."
But in the interest of charity, Christian or otherwise, I want to try and avoid the usual negative rant that this might inspire. After all, in some ways the sort of debate discussed in this New York Times article becomes so strangely polarizing as to be useless. So instead, let me set the stage with a bit of background.
Like many people I know, I've got a distrust of religion that's deep and glaring. I'm not comfortable saying it is because I'm liberal - I think it would be interesting to see what a liberal religious movement would be capable of politically. I've even debated tackling a book that would play with some of those ideas. And I don't think it's because I'm well-educated. I know plenty of well educated people who have some strong faith or other.
The roots of my distrust that I'm willing to point to come from direct experience with religion. I spent some time in a middle school affiliated with and run by members of a particular religion, and the experiences there were more than enough to make me shudder at all manner of things. It's because of my time there that I don't trust big groups clapping at the same time. It's because of my time there that I feel a little nervous any time a large group of people does anything in unison. And there are later experiences that have only deepened this distrust. Still, I would be remiss to ignore that I do feel I benefited from being dragged to church as a child, when the choice was still my parents'. There were moral lessons I took from there that I think were quite valuable. And there were communal lessons as well.
But sniping aside, there's a big question that wouldn't hurt to play with. Let's assume for a moment religion isn't going away. Not here, not in the larger culture. If that's the case, sooner or later, those of us who are liberal, well-educated folks are going to have to offer some notion of how to integrate religion of all varieties into education (in the same way that I think the left must sooner or later claim some religious space as well).
Part of the difficulty seems to be the assumption that the only way to be sensitive to religious beliefs is to facilitate religious practice. Why is it that the notion of appropriateness of timing seems to fly out the window in the American context (perhaps it does this elsewhere, but I couldn't say)?
I suppose, as a colleague of mine noted, we could do simple things to bridge the gap. In his case, it was designing a syllabus that was so frightening even non-believers would hope for divine mercy. That's one solution.
I'm not sure I've got a list I could easily put together, though perhaps I'll try one later today, as I'm sure an afternoon of meeting with students and administrators will absolutely crystallize the sense of my own mortality as well as the knowledge that I could be doing much more interesting things with whatever minutes I've got left.
I'd never thought about it before reading this article from the New York Times, but education in America serves, as much as anything, as a sort of calendar marking time. I didn't think about it when, as an undergrad, I mentioned to my interim advisor that I didn't have a plan for my electives - at least not a plan that he'd agree involved any forethought. Instead, I told him, I was going to take the courses that sounded interesting. This was clearly a foolish, wasteful idea in his eyes: the sign of a student set to drift along through college and probably not even make it out.
Why did I do it? In high school, I'd been assured by guidance counselors, by well-meaning teachers, even by friends with the same concerns that college was to be different from high school: we'd all get to do what we were interested in, not what the state or school board or our parents told us we needed. No more Calculus if I didn't want it, no more Government and Economics. College was about my desires. And though college wasn't the land of milk, honey, and free academic choices I'd been promised, it wasn't so far from it either. And so, I spent electives on things like World Literature courses, a First Amendment law course, a Jazz history course, even a feminist theory course (not because I thought I'd meet chicks there, either). When it was time to graduate, my school only allowed you to have one minor. But I'd jumped around enough without ever abandoning my major to have had five.
A lot of the same thought drives my views on education today. I'm happier because I got to explore. I even wound up taking more of those pesky Government and Economics courses than my high school career (largely spent drawing pictures of my Econ teacher in various embarrassing costumes) would have suggested. Along the way, I learned not just about cultural anthropology, not just about media history, or even about the practice of experimental psychology but also about smart decision making, about how differently people I thought I understood viewed the world, and even a bit about where I needed to be in the world. It was, as college is often assumed to be, my first real taste of freedom.
But it also took me five years and a summer session to graduate, and today I've got the student loan debt that reflects that particular desire. Even looking at that debt didn't spur me to this new thought: education is a form of Taylorism (if you're not sure what that means, see the definition offered here). What my first advisor had been trying to tell me was that taking longer than four years in college intentionally was a waste of time. It was unproductive.
What that article made me think about was a sort of cultural norm about when we become adults and what it means to be an adult in this society (look: a term from my cultural anthropology course!). At the end of high school, we assume people become adults: they're ready to make decisions, to serve their country, to pay taxes, etc, etc. But most importantly, they're ready to contribute in the most basic way: by getting a job. And if you're not ready to get a job, then the only excuse is to go to college. Why? So that when you're done, you can get a better job (huh - that course in sociological theory doesn't seem so crazy now...).
These are assumptions, of course (okay, so that logic course did pay some dividends). And perhaps unfortunately, we've normalized them - we've made rules based on those assumptions. And the modern academic system is built around them - it even penalizes "bad" decision making. A college education is assumed to take four years. If it takes more, you can find your access to financial aid diminished, you're guaranteed to have to answer in interviews why it took you so long to get out, and you may even take a hit to your reputation with friends and family ("Oh, you know he just drifted about awhile. I always thought he had more direction...").
But what if, as the New York Times article suggests about high school, a longer time spent in college might offer a different set of benefits than simple productivity?
Just a few days ago, I was advising a student, and I found myself making the assumption for her that she needed to be done in four years. And the student, not surprisingly for someone just entering college after years of having older authorities pass down unexplained proclamations, accepted it. Maybe I should have asked what she wanted to get out of her college career? Was she in it for a job? Or was there something more? And if she answered that she wanted something more than the quickest route to a slightly better job, how would the system have to work differently?
As I prepare for the first week of classes, I think I may have just hit on the first essay for my freshmen. I'll let you know how they answer.
So this morning, I attended the annual welcome back faculty breakfast, where we get to briefly gawk at new faculty who're made to pirouette while the higher-ups attempt to use their vitas as part of a comedy routine. Meanwhile, we - the entrenched, bored, on-our-best-behavior old faculty - stuff our faces with free cafeteria food (I am increasingly convinced there is nothing more enticing to academics than the notion of a free lunch). This is also where the University president traditionally exhorts us to some grand "moral" goal (the quotes are there because I don't necessarily equate the morals the president espouses with my own). And over the last two years, they've also started introducing us to the university's new marketing plan. That's right, kids, greasy bacon and academic commercials!
Oddly enough, most of us don't ever think about how universities market themselves. I'd imagine if you wanted to sell a university, you'd have to choose the best way to reach your ideal demographic. You'd probably edit your information to appeal to a 19-year-old. Or maybe you'd even suggest, in addition to the official information, that someone figure out a way to get the kids who're already at your university to make their own commercials.
First, you target them, then you infect them. Viral marketing has come to the university.
Pause a moment and think about what that would look like. Would it say much about education, amidst all those fast edits and off-center camera angles? Would you even want to? Ever search for your university's name on YouTube? Could be worth looking at whatever turns up.
Just for the sake of completion, I should mention that today I let both my Dean and department chair know that I'm planning to apply for jobs this school year. I'm hoping that both will contribute a letter of reference for me - of course, if they for some reason say no, then I'm in a bit of a quandary.
Last year I did a very small job search, mostly in hopes of landing someplace near my aging parents. And there was some oddness with my department chair over the matter of reference letters. It wound up being something of a semantic game - "You asked if I'd be willing to write a letter; you didn't actually ask if I would write one." - that ultimately worked out, but it gave me a scare about the whole process. For a moment, it felt like my battleship was in a wading pool and someone had just dropped a torpedo in the water with it. My career - where I lived, what I did, and all the things that extend from that: happiness, a social life, health - could be tanked because I forgot to say "Mother, may I?"
Life in academia carries with it a continuing sense of indebtedness in a way that I've never experienced as acutely anywhere else, even in my several years in the kiss-ass-or-die world of corporate banking. Obviously there's intellectual indebtedness - that's why we cite things so religiously. And certainly I owe a lot to the various faculty and colleagues who've asked me to read something, who've challenged what I asked them to read, who argued with me over drinks at the bar on Thursday nights. But it goes beyond that. Having done both hiring and firing in the corporate world, I remember all too well how references worked there. When I was at the bank, checking references was essentially limited to calling an employer and asking "Did Person X work there?" There can be all sorts of legal hassles if they try to tell you that Person X was a bad employee, plus most people find someone who isn't going to hang them anyway.
In academia, the reference can be everything. You don't have to look far to find someone in academia who owes their career (or their lack of it) to the person who chaired their dissertation committee. When I first looked for an academic job, one of the interviews I got owed no small debt to the friendship between my undergraduate advisor and someone working at the school in question. Academics aren't afraid to call someone they know where you worked and ask for all the dirt. And they're not afraid to share it with anyone and everyone who'll listen.
Increasingly (at least in my experience) potential employers want detailed reference letters at the outset of an application process. It used to be - or so I'm told - fairly common to just list contact information, and if an employer wanted more details, they'd contact references. But in the years I've been jumping from place to place and job to job, the majority have wanted letters.
Where this really becomes problematic is when you're casting a wide net. My first year of job searching, when I was A.B.D., I applied to 108 jobs. Imagine if even a quarter of them wanted a full letter (and I'm fairly certain it was more than that). Now imagine that you've somehow done someone wrong (or that they think you have).
So presently, there are four jobs out there that I'm interested in. The theory is that since I've got a couple of publications and a book contract, plus good teaching evaluations, this is my best year to jump to a new ship. I'll keep you posted on how it goes.
It's around this time each year that U.S. News and World Report issues its annual college ratings. The ratings are a bigger deal for colleges than for anyone else - even the people who use them - because they help not only in recruiting students but in pushing for external funding.
What most people don't realize is just how tricky (read: near-crooked) the ratings are. For example, a recent article in the Wall Street Journal reported that when two colleges corrected information on their alumni-donation rate, their ratings slid. [Edit: they've since corrected this - only one of the two schools' ratings slid].
But what isn't so widely reported is that part of the rankings is based on the perceived status of the college or university in the eyes of other colleges and universities. In other words, what other schools report they think of your school can determine your school's ranking. If just parsing that sentence takes a moment, you can be sure there's more fun in the works. Clearly, that sort of system seems like it shouldn't be a problem, right? In a world where education becomes more business than anything every day, we can count on truth over the bottom line, right? I mean, if we asked Coke executives to tell us what they thought of Pepsi, of course they'd be honest with us. The method is more than a little suspect, and it has resulted in some schools boycotting certain parts of the system (often leading to falls in their own rankings), asking for more information about the methodology itself, and even claiming the system favors private schools.
It's amazing how often these things occur, though. Recently the school I'm at sent out messages to employees about a regional survey seeking to find the best places to work. There's no self-promotion there, I'm sure. What's interesting is that I've never heard anyone report on the ten worst places to work in a region though it seems like knowing that could be just as important.
For those of you who're curious, the contract was ratified. Here are the various things you may have placed bets on:
not only was there a quorum, more than 2/3 of the faculty voted
the contract was ratified by a little over 4:1
And in other news, it's been red-headed step-child day, with "new" furniture gifted from other departments. Maybe next year, one of 'em will give us their last dollar so we can take a shot at the Golden Ticket.
And that said, today has been an absolutely awful day. The reason why isn't really something fit to be mentioned here. But I've got a couple of dear friends who are having a worse time of it than I can imagine. It's not often I'm at a loss for what to do for friends, and this one has floored me. So if there is actually anyone reading this, take a moment and think a good thought for them and their loss. And be thankful for what you've got.
Those of you trying to learn how to successfully negotiate a contract from the management side, take note: a key to ensuring a strong bargaining hand in negotiation is to redirect mistrust and frustration.
I've mentioned a bit about contract negotiations here, and in my previous post I gave some examples of how the proposed contract effectively pits faculty against each other - rather than against the university - on the way to more equitable pay. Today, the conversation has shifted to something more interesting, and (again) the argument is between faculty members.
As with many colleges the size of the one I teach at, faculty workload is an issue. Here, faculty teach four courses per term. That load is further complicated by preps - the number of distinct courses you have to prepare. For example, I might teach a term with two preps but four total courses. There are a few competitive opportunities to get the number of courses reduced, and there are reductions built in for things like chairing a department and teaching certain numbers of graduate courses. Now it's worth noting that the majority of graduate courses at this school happen in the Business and Education programs (coincidentally (?), these are the same programs that have the highest pay). Making matters worse, the college's core curriculum - the courses that all students have to take at the undergraduate level, regardless of focus - is taught in programs that are on the low-pay, high-prep end of the scale. And the more courses you teach, the harder it is to do research, which is one of the keys to getting that most holy (though it isn't what the average person on the street thinks) of academic goals: tenure.
Imagine having to prep lectures for four courses a week, grade the papers for all those courses, advise students, serve on a committee or three, and then try to figure out time to write papers for conferences and publication. Effectively, under a 4/4 load, the only research that gets done happens on your (unpaid) vacations, so you can imagine that workload is an issue.
The proposed contract has received some concessions on this score. Over the course of four years, the workload will be reduced to 3/3, with the number of course reductions decreased correspondingly. Seems like something everyone could get on board with, right?
Remember those reductions I mentioned that some people get for teaching graduate courses, etc.? Well, those reductions become harder to claim if workload declines. The problem is this: while the number of courses that a faculty member has to teach goes down, the number of courses that have to be taught doesn't. So you either hire new faculty (or adjuncts) or you give up reductions. This is where today's faculty-on-faculty aggression has hit. You see, in the short run, those people who have been able to get reductions based on graduate teaching - often down to a 3/3 load under the current system - will lose those chances and will have to teach what the rest of the faculty have to teach.
Effectively, these faculty, who are already paid more and teach less, find themselves being asked to work more than they're accustomed to for less of a raise than they might get otherwise. And faculty in the areas that pay worse are confronted with the fact that they teach more and earn less. There have been insinuations that graduate faculty work less, and that helping one group is hurting the other.
And the University? Well, it's an interesting note that in none of the communications that I've been privy to has the University itself even been referenced.
Apologies for being away so long. It was summer, and I stopped reading some things in favor of reading other stuff, and that didn't translate into good blogging material. Plus my summer courses didn't take, so I've not really been around the university (or around anywhere, really, as I've had big things to save up for - see the end of the next paragraph). But it's almost time to get back to it, and so, back to the blog.
Ignoring for a moment that academics don't really get summer vacations - we just get to stop teaching and drawing paychecks in favor of trying to do the research we've not had time for during the regular terms - summer vacation has been pretty good. I've managed to get two papers ready to pitch, I've got a couple of proposals coming up, one of my students got a job in her field, another is applying to graduate school, and I spent two weeks in Paris.
That ending clause is so nice, I'll say it twice: "and I spent two weeks in Paris."
But the really interesting bit (after spending two weeks in Paris, calling anything interesting this town is a stretch, but we'll run with it) is that the small school I'm at is in a contract negotiation year. The faculty here are unionized, though the union was described to me in orientation as a "company union." I've since been assured - though not by anyone actually representing or actively involved with the union - that such a description was an unfortunate choice of words and simply not true.
Last week the proposed new faculty contract was presented to a meeting of about 40 faculty members, and a few interesting things happened. I'm only going to tackle two major things here, though there were more. Best to leave something for future blog entries, right?
First, there was considerable distress - particularly from "new" faculty (who make up a majority of the faculty here) - that they weren't included in the process. A quick check of e-mails (and I should note that I'm at a school where e-mail is THE way of communicating - on an average day, it isn't unheard of for someone to send an e-mail announcing they have extra staples they don't need and, 10 minutes later, an e-mail saying the staples have been claimed) shows that there has been no mention of the union since Fall term, when a social was announced at which more information was promised. It never came. So imagine the surprise when the response given was that faculty simply didn't care enough to get involved; if they'd really cared, they would have found the working groups they had no idea were being formed.
But the second interesting thing was the proposed raise schedule. At this school, faculty raises are set for whatever period the contract stipulates - usually three years. And faculty raises help to determine the raises for staff at the university - I'm told that, typically, whatever the faculty get, the staff get too. The proposed raise this year is 3.5% (incidentally, that's less than the rise in cost of living according to the Bureau of Labor Statistics). But the way it breaks down is a little strange and is the second bone of contention about the proposed contract.
Before breaking it down, some things to keep in mind:
different areas in academia tend to be paid at different levels (for example, business faculty tend to make considerably more than faculty teaching liberal arts courses). The typical reasoning for this is that business faculty could make high salaries in the business world and so need to be paid more to teach (there are all sorts of flaws with this logic, but that's not today's point)
faculty hired recently tend to make comparatively less than faculty hired 10, 20, or 30 years before. If you adjust incomes for inflation, the more recently a faculty member has been hired, the more likely it is that they're being paid relatively less.
With that in mind, here's how the new contract proposes the raise would work. Every faculty member gets a 2% increase on their base. The university then totals all faculty salaries, computes 1.5% of that total, and divides that pool equally among all faculty.
So imagine Faculty member X, who has been at the university for two years and makes $45,000, and Faculty member Y, who has been there for 25 and makes $100,000. If the raise were a straight 3.5%, X would make $46,575 and Y would make $103,500 (before taxes, of course). Under the proposed raise (assuming these two were the only faculty members), it works out like this: X gets $46,987.50 and Y gets $103,087.50. I'm betting you can see what the bone of contention is here.
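For anyone who wants to play with the numbers, here's a quick sketch of the arithmetic under one reading of the proposal - a 2% bump on each individual base, plus an equal per-head share of a pool equal to 1.5% of total payroll. The function name and the per-head split are my own assumptions for illustration, not language from the contract, and the exact figures depend on how that 1.5% is interpreted:

```python
def proposed_raise(salaries, base_pct=0.02, pool_pct=0.015):
    """Sketch of the proposed two-part raise: each salary gets an
    individual percentage bump, plus an equal per-head share of a pool
    computed as a percentage of total payroll."""
    # The equal-share pool: 1.5% of the sum of all salaries, split per head.
    pool_share = pool_pct * sum(salaries) / len(salaries)
    return [s * (1 + base_pct) + pool_share for s in salaries]

# Two-person illustration from the post: $45,000 vs. $100,000.
new_salaries = proposed_raise([45_000, 100_000])
print(new_salaries)
```

The flat pool share is what closes the gap: the same dollar amount is a much bigger percentage raise on $45,000 than on $100,000, so the lower-paid faculty member ends up above a straight 3.5% and the higher-paid one below it.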
Here's a moment for you game theorists out there. What do you think will happen? Faculty who feel largely excluded from the negotiation process are given the choice of whether to approve a contract that effectively closes the gap between faculty members by pitting faculty members' raises against each other. Think it'll pass? And before you place your bets, you should know that whether the contract is approved or not is determined by a simple majority of voters, with no requirement for how many voters need to turn out. And the faculty have a little under a week to approve the contract.