Weirdly, when our team got a message saying “we’re going to upgrade your server,” we didn’t expect you to redirect our DNS entry to a machine so new that it has no files on it. Not just no files, but no configuration whatsoever. And no users, so no way to, I don’t know, configure the machine such that one can put files on it.
I know. Silly us.
No worries, though. It’s not like we were running a large-scale scholarly community that depends on the goodwill of its volunteer participants, whose goodwill varies directly with the perceived stability of the platform.
Thanks, however, for giving the team a bit of clarity on the whole “hey, do we want to stay with this hosting provider or look for another one that might be better suited to our needs” business!
I’ve had an on-and-off romance with running for nearly 20 years now. I came to it late; I hated running as a kid, and I avoided it as much as I could in high school. And given that on the one hand I was pretty notably underweight until my mid-20s, and on the other, I grew up in a time and place that hadn’t yet been touched by things like girls’ soccer teams, nobody really forced me to think about anything like exercise. I joined a gym here and there; I took the occasional aerobics class. Never anything with any focus.
Until I went back to grad school. For some reason, that first semester at NYU I got serious. I went to Coles (which I recall being pretty shiny and new then) and took a prescriptive fitness class, where I learned a few basic things about exercise and was supervised through a range of circuit training programs. I remember spending a lot of my cardio time on a stair climber, until one of the instructors stopped me one day and said “mix it up a bit, Kathleen!” So I got on a treadmill and ran a mile in 10 minutes — the first mile I had ever run in my life. I was 26.
And I was hooked. R. and I started running together whenever we could. I was way slower than he was, always, but he pulled me along and got me to do more than I thought I could. And I ran by myself, too: endless tight little laps on Coles’s roof track, at first, and then once I moved to Hell’s Kitchen, early morning loops of lower Central Park. Those years, I was probably in the best physical shape of my life, and it was clear that the running was helping keep me on an even keel through the craziness of grad school.
But, being a grad student, I let the running gradually come, like everything else, to be about Accomplishment. There’s nothing wrong with that, at least in the abstract, but it did something to the experience for me. It drove me to do more and more, well past the point at which I really should have just let myself settle into a more meditative routine.
In 1997, as I went on the job market, moved into high gear trying to finish my dissertation, and took on a full-time load of freelance work, my number came up in the lottery for the NYC marathon. And so I added training for that race to my schedule.
The marathon itself was amazing, though I ran it about half an hour slower than I’d hoped (partly for reasons out of my control; partly because of some less than optimal choices I made). It was an astonishing day, though, and I have no regrets whatsoever about the marathon itself. Training for the marathon, however, was another story. For months, I got up well before dawn to go run before settling down to work. I gave up hours and hours during the week, and pretty much a full day on the weekend, to running. And everything hurt pretty much all the time — not from an injury, just that overstressed, overused, constant ache.
I recovered slowly after the race, and gradually got back to a more normal level of running. Sort of. Something about all of those hours made me kind of dread running, and so once I graduated and moved to Claremont and started the business of being an assistant professor, I gradually… just… stopped.
Which is when the running dreams started, I think. I’d have these incredible dreams about running very strange race courses — across cities, in buildings, down stairs, through stores. Or I’d start running to try to catch someone, and just keep going. In my dreams, I was fast, and I felt great. A little nudge from the unconscious, I think, saying “don’t you want to feel this again when you’re awake?”
So I did gradually pick the running back up again, but wound up following the same cycle: ran well and felt great; ran more and felt better; decided to see if I could run another marathon. That one was Los Angeles, in 2005, and again the race itself went super well. And again, all the running before and after, a bit less so. I blew out one of my arches due to all the overtraining, and wound up with orthotics, which I never really got the hang of running with. And gradually, again, I stopped.
I picked the running back up a bit during my sabbatical a couple of years ago, but things started hurting again, and so I backed off once more, trying to find my way to something that would be enough. Since then I’ve done some yoga, and a bunch of walking, but nothing has felt quite as good as running at its best has felt. And if I actually get to move into the apartment that I’m hoping I’ll be moving into soon, I’m going to have amazing access to another great park, and I want to be able to take advantage of that.
So I’m back at it, running again. And I’m trying to get myself to think about “enough” on the front end, as I’m starting up, rather than when things begin breaking down. I’m nearly 20 years older than I was when I ran that first mile, and I weigh a fair bit more, and things just don’t work quite like they used to. I eased my way into running this time with a lot of walking, and then slow short running intervals, gradually increasing them until I could run continuously. I’m a couple of months in, and it all feels pretty decent — nothing hurts, and I’m recovering from my runs well.
But I’m slow. What used to be my steady training pace is now my all-out intervals pace. I can feel my younger selves sneering at what my steady training pace has now become.
I remember telling R. years ago, in those early running days, that the key aspect of discipline for me was less about the need to make myself go do something than it was about the need to keep myself from doing too much. And so I’m trying to be very disciplined about things, to build strength slowly, to keep plodding forward, to focus on the years ahead rather than the miles right now.
It’s not easy to write or talk about doubts. The things we have doubts about are often precisely those things that are most important, both to us and to those around us: a relationship, a job, a major life choice. If they weren’t important, our ambivalences and worries wouldn’t reach the level of real doubt.
But those things are often so important that even feeling a little bit of doubt around them (did I make the right choice? is this going to work?) can become a crushing weight. Doubt in those cases seems tantamount to betrayal, especially when it’s clear that acknowledging those doubts would create anxiety in the people around us. How can you possibly admit to feeling doubt? It would only let everyone down.
Or, if it won’t disappoint someone else, doubt can feel like an admission of error — and the stakes of such error can be too high to countenance. (If one has spent ten years preparing for a career, for instance, experiencing doubt about the choice not only feels like failure, but like a failure so long-term that it raises the possibility that one might have wasted one’s life tout court.)
So the doubt gets suppressed, stuffed into the corners of our lives that we ignore. And sometimes that works, and in the busyness of the day-to-day, in the daily struggles and triumphs, the doubt fades. But sometimes it festers in those corners, and feeds on itself and everything around it, becoming much worse than is necessary.
Finding the sweet spot between letting doubt metastasize inside us and infecting others with it is an enormous challenge. This is the kind of thing that people rely on trusted advisors, therapists, clergy, and really close friends for — airing doubts with someone who won’t freak out, someone who can act as a reality check and reflect the doubt back at an appropriate size.
I find myself, however, wanting to write about my doubts, to air them publicly, in part as an attempt to demonstrate — as I have found myself doing over and over with a range of professional fears and failures — that we all experience this pain. I’m confident, in fact, that we all (1) feel painful levels of doubt, precisely because that doubt is a core element of the intensely self-reflective careers that we have chosen. Not-knowing, uncertainty, insecurity, second-guessing — without them, we wouldn’t have questioning, investigation, development, growth.
So here’s the admission: I have doubts. Big honking doubts. Now more than ever. I’ve been asked more times than I can count over the last two years how my career transition has gone, how I feel about the change, and my standard response has been to say that 90% of the time, I’m absolutely certain I’ve made the right choice. And I think that’s all anybody can ask for.
What I don’t tend to say is that 10% of the time, the doubt can be all but paralytic. And I also don’t say that it’s gotten more intense lately, now that I’ve taken down the safety net. In fact, though, it’s been particularly acute for the last few weeks, as I’ve felt myself not getting done the things I want to do, and not doing well at the things I need to do, and as I’ve been left wondering whether I’m really cut out for this new gig at all, and what if I’ve made a horrible, terrible, irreversible mistake.
It’s not at all coincidental, I think, that my doubts — indeed, my self-doubts — have become so much more painful and pronounced just as I’m inching up on closing the largest financial transaction of my life: I’m buying an apartment in New York.
(That sound you hear is me hyperventilating.)
It’s not just a transaction with huge financial implications. It’s putting down roots. It’s not just saying “I’m not going back there,” as I did some months back. It’s saying “I’m staying here.”
And on a day when, for one reason and another, I just don’t feel like I’m good at my job, the weight of those doubts becomes unbearable.
* * *
I had a dream over the weekend that I think is about all of this doubt. I’ve been dreaming about work more or less non-stop for weeks, anxiety-filled dreams about trying to get stuff done and being unable to keep the details from skittering off everywhere. But this one was different: I dreamed I quit. I told the people around me that I just couldn’t handle it anymore.
Right in the middle of that, I remembered a couple of my projects — in fact, the biggest, scariest projects that are actually on my desk right now. I realized that I wasn’t going to be involved in seeing them through. And I was suddenly, crushingly, disappointed.
I wanted to be involved in those projects. I wanted to be the one who would get to see them through.
And so I ran off, trying to find Rosemary (hi, Rosemary! Don’t worry; it turns out well) to take my quitting back, to tell her I’d changed my mind. But I couldn’t find her, and I was horribly afraid it was too late.
And just as I told someone that, a huge airliner (2) came flying in right overhead. Way. Too. Low. And it pulled up hard, but too late, and it clipped the top of the building across the street, and flipped over, and fell to earth upside down.
Everything else I was thinking just stopped, and I stared at the upside-down plane. Literally: the upside-down plane. It wasn’t wreckage. It wasn’t on fire. It was just sitting there. And all the passengers, who I had been sure were dead, were filing off in an orderly fashion.
And I thought, Huh. It’s all okay.
Which is when I woke up, thrilled beyond belief that I hadn’t in fact quit my job, no matter how stressful it can be at moments. Certain I could work through the doubts.
* * *
I started writing this post on the subway yesterday morning, feeling as though I needed to do some public thinking about the nature of doubt and what it means for the choices we make. Got into the office, put it aside, and took care of business. And proceeded to have a day utterly full of win.
The doubts will — undoubtedly, ha — come back. But even if I crash, it doesn’t mean I have to burn. It is really possible, even when it doesn’t seem so, that it will all be okay — maybe because being willing to embrace the doubt means that I’m ready to do the impossibly scary things ahead.
- Okay, maybe not all. There are a very few people out there who are completely devoid of neurosis. I’ve met a couple. I wonder what they spend their time thinking about. ↩
- It was clearly marked as a Delta jet, a detail that was anomalously vivid. Why Delta? Was it wishful thinking, pushing failure off on the other guy, since I’m a United frequent flyer? This morning it hit me: not Delta, but delta. Change. Thank you, Dr. Freud. ↩
Back in the late spring of last year, I participated in a panel discussion on the future of publishing in visual culture studies, as part of the Now! Visual Culture symposium held at NYU. The panel organizers, Marquard Smith and Mark Little, have edited our presentations together into a brief collection entitled “Future Publishing: Visual culture in the age of possibility,” which they’re releasing today.
I’m very happy to have been able to participate in such a great discussion, and to be able to help spread the word a bit further. Please download, read, respond, repost; we look forward to hearing from you!
Lately I’ve found myself in one of those periods — perhaps we might refer to it as “my forties” — in which I’m so overwhelmed with the details involved in just keeping up with the most immediate and pressing tasks ahead of me that not only have I not gotten to do any writing, I’ve barely even found the space to contemplate what I might write if I had the time.
This makes me profoundly sad.
It’s not just about feeling too busy — it’s about the busy making me feel unfocused and unproductive. As though the big picture is slipping away in the masses of tasks that take up the work day and bleed over into evenings and weekends. And days off: not too many weeks ago, I’d made a pact with a friend to observe the oddity of the Presidents’ Day holiday by really making it a day off, celebrating by lying around reading a novel. Instead, I spent the day catching up on the many work and para-work tasks that just cannot be gotten through in the office. I got a lot done. I couldn’t tell you what, but it was a lot. It was kinda great, and kinda awful.
Another friend recently noted that I’ve come to refer to my plans to take a genuine day off by saying “I’m going to lie around and read a novel.” And as a professor of literature, at least in my not-too-distant past, I’ve got to marvel a bit at the association I’ve managed to build between novel-reading and leisure. Sloth, even: it’s not just reading, it’s lying around reading.
At some point, probably right about when I stopped teaching literature classes, the prior association I’d had between reading fiction and work began to fade. Reading fiction became play again, the way it had been when I was a kid. In part, the sense of fun in reading came back because I let it — I gave myself permission to read whatever I wanted, without any pressure to make use of what I was reading by either teaching it or writing about it. Without any pressure for the reading itself to be important. It was just about pleasure.
What happened shouldn’t come as much of a shock: I started reading more.
I’m looking now for a way to return that sense of play to my writing, to lessen the pressures that my preconceived notions of productivity have placed on it. I want writing to become a retreat from work again, rather than being all about work. I want it to be the thing I can’t wait to escape back into.
In order for that to happen, I think I’ve got to give myself a similar permission not to take it quite so seriously. What might be possible if I didn’t feel the pressure for my writing to be of use — if I didn’t need for it to be important? What if I could let my writing be just about pleasure?
Can I build an association between writing and goofing off?
Can a day spent sitting around writing come to feel like a holiday?
Tim McCormick posted an extremely interesting followup to my last post. If you haven’t read it, you should.
My comment on his post ran a bit out of control, and so I’m reproducing it here, in part so that I can continue thinking about this after tonight:
This is a great post, Tim. Here’s the thing, though: this is exactly the kind of public disagreement that I want the culture of online engagement to be able to foster; it is, as you point out, respectful, but it’s also serious. The problem is that I think this kind of dissensus is in danger as long as our mode of discourse falls so easily into snark, hostility, dismissiveness, and counterproductive incivility.
I don’t think it’s accidental that we are having this discussion via our blogs. I had time to sit with my post before I published it. You had time to read it and think about it before you responded. I’ve had time to consider this comment. And not just time — both of us have enough space to flesh out our thoughts. None of this means that by the end of the exchange we’re going to agree; in fact, I’m pretty sure we won’t. But it does mean that we’ve both given serious thought to the disagreement.
And this is what has me concerned about recent episodes on Twitter. Not that people disagree, but that there often isn’t enough room in either time or space for thought before responding, and thus that those responses so easily drift toward the most literally thoughtless. I’m not asking anybody not to say exactly what’s on their minds; by all means, do. I’m just asking that we all think about it a bit first.
And — if I could have anything — it would be for all of us to think about it not just from our own subject positions, but from the positions of the other people involved. This is where I get accused of wanting everybody to sit around the campfire and sing Kumbaya, which is simply not it at all. Disagree! But recognize that there is the slightest possibility that you (not you, Tim; that general “you” out there) could be wrong, and that the other person might well have a point.
So in fact, here’s a point of agreement between the two of us: you say that we need to have “the widest possible disagreements,” and that “to be other-engaged, and world-engaged, we need to be always leaning in to the uncomfortable.” Exactly! But to say that, as a corollary, we have to permit uncivil speech, public insult, and shaming — that anyone who resists this kind of behavior is just demanding that everyone agree — is to say that only the person who is the target of such speech needs to be uncomfortable, that the person who utters it has no responsibility for pausing to consider that other’s position. And there, I disagree quite strongly. (As does, I think, Postel; being liberal in what you accept from others has to be matched by being conservative in what you do for the network to be robust.)
I do not think that it should be the sole responsibility of the listener to tune out hostility, or that, as a Twitter respondent said last night, it’s the responsibility of one who has been publicly shamed simply to decide not to feel that shame. There’s an edge of blaming the victim there that makes me profoundly uncomfortable. But I do think that we all need to do a far better job of listening to one another, and of taking one another seriously when we say that something’s just not okay. That, I think, is the real work that Ryan Cordell did in his fantastic blog post this morning. It’s way less important to me what the specific plan he’s developed for his future Tweeting is (though I think it’s awesome); it’s that he took the time to sit down with a person he’d hurt and find out what had happened from her perspective. It’s not at all incidental that they walked away from their conversation still disagreeing about the scholarly issues that set off their exchange — but with what sounds like a deeper respect for one another as colleagues.
This has all become a bit heavier than I want it to be. I have no interest in becoming the civility police. Twitter is fun, and funny, and irreverent, and playful, and I want it to stay that way. But I really resist the use of shame as a tool of either humor or criticism. Shame is corrosive to community. It shuts down discussion, rather than opening it up. And that’s my bottom line.
Folks, we need to have a conversation. About Twitter. And generosity. And public shaming.
First let me note that I have been as guilty of what I’m about to describe as anyone. You get irritated by something — something someone said or didn’t say, something that doesn’t work the way you want it to — you toss off a quick complaint, and you link to the offender so that they see it. You’re in a hurry, you’ve only got so much space, and (if you’re being honest with yourself) you’re hoping that your followers will agree with your complaint, or find it funny, or that it will otherwise catch their attention enough to be RT’d.
I’ve done this, probably more times than I want to admit, without even thinking about it. But I’ve also been on the receiving end of this kind of public insult a few times, and I’m here to tell you, it sucks.
I am not going to suggest in what follows that there’s no room for critique, even on Twitter, or that we all ought to just join hands and express our wish for the ability to teach the world to sing. But I do want to argue that there is a significant difference between thoughtful public critique and thoughtless public shaming. And if we don’t know the difference, we — as a community of scholars working together online, ostensibly trying to make the world a more thoughtful place — need to figure it out, and fast.
There are two problems working in confluence here, as far as I can tell. One is about technological affordances: Twitter’s odd mixture of intimacy and openness — the feeling that you’re talking to your friends when (usually, at least) anyone could be listening in — combined with the flippancy that often results from enforced, performative brevity too frequently produces a kind of critique that veers toward the snippy, the rude, the ad hominem.
The other problem is academia. As David Damrosch has pointed out in another context, “In anthropological terms, academia is more of a shame culture than a guilt culture.” Damrosch means to indicate that academics are more likely to respond to shame, or the suggestion that they are a bad person, than to guilt, or the indication that they have done a bad thing. And he’s not wrong: we all live with guilt — about blown deadlines or dropped promises — all the time, and so we eventually become a bit inured to it. But shame — being publicly shown up as having failed, in a way that makes evident that we are failures — gets our attention. That, as Damrosch notes, is something we’ll work to avoid.
And yet, it’s also something that we’re more than willing to dole out to one another. There’s a significant body of research out there — some of my favorite of it comes from Brené Brown — that demonstrates the profound damage that shame does not only to the individual but to all of the kinds of relationships that make up our culture. Not least among that damage is that, while a person who feels guilty often tries to avoid the behavior that produced the feeling, a person who feels shame too often responds by shaming others.
So, we’ve got on the one hand a technology that allows us, if we’re not mindful of how we’re using it, to lash out hastily — and publicly — at other people, for the amusement or derision of our followers, and on the other hand, a culture that too often encourages us to throw off whatever shame we feel by shaming others.
Frankly, I’ve grown a little tired of it. I’ve been withdrawing from Twitter a bit over the last several months, and it’s taken me a while to figure out that this is why. I am feeling frayed by the in-group snark, by the use of Twitter as a first line of often incredibly rude complaints about products or services, by the one-upmanship and the put-downs. But on the other hand, I find myself missing all of the many positive aspects of the community there — the real generosity, the great sense of humor, the support, the engagement, the liveliness. Those are all way more predominant than the negative stuff, and yet the negative stuff has disproportionate impact, looming way larger than it should.
So what I’m hoping is to start a conversation about how we might maximize those positive aspects of Twitter, and move away from the shame culture that it’s gotten tied to. How can we begin to consider whether there are better means of addressing complaints than airing them in public? How can we develop modes of public critique that are rigorous and yet respectful? How can we remain aware that there are people on the other end of those @mentions who are deserving of the same kinds of treatment — and subject to the same kinds of pain — that we are?
This is in part an apology for having ranted and run yesterday; between the little project I’m trying to get launched in the next couple of weeks and a meeting that took up a good chunk of yesterday, I wasn’t able to stay on top of the conversation that my post started for very long.
I’ve tried to catch up on it, though, and have a few thoughts I now want to add.
My hatred of the term is not meant to signal any sense that the thing it’s meant to refer to doesn’t exist. To deny that the dominant logic of contemporary U.S. culture is the logic of the market would be a fruitless exercise. Nor do I want to defend that logic, or suggest that there aren’t real consequences to its dominance.
But I do want to suggest that the logic is so pervasive, and the concept used to describe it so totalizing, that, like “postmodernism” before it, at some point it ceases to have the desired critical effect. As in the case of postmodernism, one has to begin to wonder whether there is any outside to neoliberalism. If there is an outside, how do we get there? If there isn’t, what work is pointing out the water in which we all swim actually attempting to do?
The other problem with the term, and the one that I was mostly focused on yesterday, is its conduciveness to sloppy adoption and deployment. This, too, plagued “postmodernism,” a term that got tossed around like confetti until what descriptive or critical power the term had utterly dissipated. What makes it worse in this case is that “neoliberal” is so clearly meant to be a pejorative, and that it gets deployed in the ways that, as Ted Underwood pointed out on Twitter, “bourgeois” once was. There are times when that term is undoubtedly called for. But like “bourgeois” or “reactionary” or any number of other such terms, I have too often of late heard “neoliberal” deployed as an insult by people on the left against other people on the left. It’s the classic circular firing squad of ideological purity, and it makes me nuts.
I have come to despise the term “neoliberal,” to the extent that I’d really like to see it stricken from academic vocabularies everywhere. It’s less that I have a problem with the actual critique that the term is meant to levy than with the utterly sloppy and nearly always casually derisive way in which the term is of late being thrown about. 1 “Neoliberal” is hardly ever used these days to point to instances of the elevation of market values above all others — it’s used to tar anything that has anything to do with any market realities whatsoever. Which, hello, United States, 2012. Welcome.
So to say, for instance, that the university-in-general is a neoliberal institution is to say precisely nothing. Name me one contemporary institution — seriously, an actual institution — that isn’t. Including every last one of us. None of us got to live in the places we live or study in the places we study or read on the freaking internet without market realities giving us the wherewithal to do so. 2
To say, on the other hand, that some universities are more beholden to market values than others — that some have made a value of the market, to the extent that they bear only the market in mind, and precious little else — and have therefore acquiesced all too willingly to the pressures of neoliberalism, actually might mean something. As it might to say that, for instance, having marketability as our only indicator of the value of scholarship or a scholar’s work represents a neoliberal corruption of the critical project in which we as scholars are ostensibly engaged. But that’s no longer how “neoliberal” is being used, at least in my hearing. It’s instead become a blanket term of dismissal, often aimed at institutions that do not have means of fixing the inequities by which we’re beset, inequities that are way larger than any university, even the university-in-general, can take on without serious support coming from somewhere.
So no more. “Neoliberal” is henceforth dead to me. I will take seriously no more casual statements that toss it around like popcorn, no further arguments that rely on it without any sense of specificity or grounding.
(And as for the tendency to associate anything that involves a computer automatically and of necessity with neoliberalism? Don’t even get me started. 3)
- What’s happened to “neoliberal,” in fact, is not all that different from what happened to “deconstruction,” when it got adopted as a smart-sounding way of saying really close reading. And in this usage, it’s never an invitation to further discussion; it’s a conversation ender, the critique to which there can be no response. ↩
- And to fault the university-in-general for its capitulation to the market when, in the age of state abdication of responsibility for funding higher education, there is literally nowhere else to turn, strikes me as laying blame at entirely the wrong doorstep. Should universities be spaces protected from market values? Yes. Tell me how we get there, while keeping the university running in the process. ↩
- And if you make such an argument while your fingers are resting on the keyboard of a very thin, sleek laptop? Do I need to say the rest? ↩
I find myself at one of those moments at which everything is great and yet nothing seems to be working exactly right. I’ve got an enormous deadline just ahead — not, alas, the “boy, I’m going to blow that deadline and then I’m going to feel sheepish and guilty when I finally send the thing in two weeks late” kind, but the “I will be standing in front of a very large crowd of people unveiling absolutely nothing if this thing doesn’t get done on time” kind. And in fact I think it’s going to get done on time, if we can keep all the little parts working like they’re supposed to. But this weekend a whole bunch of the little parts stopped working. Freaking out may have ensued.
My stress levels, it is needless to say, are through the roof right now. And so Sunday morning, I finally managed — after an altogether alarming number of weeks — to get myself out the door and to a yoga class. And the class was mostly great, and I’m very glad I went, but I had the thing happen afterward where the class managed — I don’t know how else to describe this — to open one of those spots in my body where I shove a whole lot of anxiety and anger and sadness that I don’t want to deal with, and so all of that got released and came flooding to the surface. Needless to say, this is more or less the exact opposite of what I want from yoga.
I’m trying to leave myself open to the possibility, however, that it’s what I need, that exhuming all that negative stuff is a necessary precursor to developing the positive stuff I’m looking for. And so I tried to do the thing that I find so hard: to really let myself feel the anxiety and anger and sadness without either clinging to the feelings or pushing them away.
Saying that I find that hard is an understatement. For one thing, I have a thick streak of Pollyanna in me, one that fairly relentlessly shoves aside anything negative with a rousing internal chorus of “take off that gloomy mask of tragedy; it’s not your style” and other such anthems of indefatigable optimism.1 For another, however, and probably more importantly, I have spent so long as a scholar living in brain-on-a-stick mode — pushing aside all of the claims not just of my body but of my heart as well, in favor of a total acquiescence to the dictates of my head — that I find it really, really hard to actually feel what I am feeling. As soon as I start feeling something, I want more than anything to know what I am feeling, to name it, determine its etiology, decide whether it’s beneficial, and if not, eradicate it as quickly as possible.
Actually living with a feeling long enough to feel it? Unthinkable. Which may precisely be the point.
There’s a deep irony in this, given that I was a most over-emotional adolescent — and that adolescence stretched on longer than I might care to admit. It’s possible that I was referred to as “histrionic” on more than one occasion, and certain very close family members may or may not have compared me to melodramatists of screen and stage. (Often.)
I learned from those family members, of course, not just about what was seemly and what wasn’t, but also what was valued and what wasn’t — and it turned out that the ability to contain your emotions, to condense them into a little knot that can take up residence between your shoulder blades, to push feeling aside in favor of thinking, was a useful skill, professionally speaking. And I discovered that the more I rationalized, the less frequently I was told I was irrational, over-emotional, high-strung. The more, in fact, I was told that I was smart.
I’m now at a crossroads, however, at which I am beginning to wonder whether there might be benefits — I mean, not just personal benefits, but real, actual, professional benefits, benefits for the profession and its relationship to the world — to ending the rational charade, to remembering what it felt like to feel things, even to letting feeling sometimes take the lead.2 What would it be for academia to cultivate its relationship with its heart just as much as that with its head?3
Perhaps I’m over-generalizing what is in fact a personal, individual issue. But I don’t think so. I am coming to think that many aspects of academic life, from faculty meetings to hiring and promotion processes, including communication both amongst ourselves and with the outside world, would be much improved if we all stopped insisting that everything of value can be thought, if we focused on cultivating an emotional maturity to complement our intellectual maturity. If we weren’t too embarrassed to hit “publish” on a post that starts like this one, that’s so personal as to be all about how I feel.
- Yes, I realize I’m mixing my texts there. In reviewing this post, however, I found myself struck by the degree to which the American musical theater and related films of the mid-twentieth century were apparently produced by that annoying guy on the street who cannot refrain from saying “smile, beautiful! It’s not that bad!” I have been fully interpellated by the “sunny side of the street” authorities, and yet I still want to punch that guy and shout “no, it is precisely that bad.” ↩
- My previous footnote points to one of the obvious challenges: the degree to which American feelings have been manipulated by, and often surrendered to, its popular culture. It’s in this vein that undergraduates often complain about classes in English or media studies draining all of the pleasure out of their objects; it’s only in a rational exploration of those pleasures that we’re able to see how they’re constructed — but once we see it, it often takes a lot more work before we can get back to untrammeled enjoyment. ↩
- And so, in the spirit of the previous footnote, what would it be to acknowledge that even the debased, manipulated feelings generated by popular culture are in fact feelings, and that while they need to be separated from meanings, they nonetheless carry a profound importance for the ways contemporary culture does and doesn’t — especially doesn’t — function? Am I once again shoving away heart in favor of head if I wonder what we might learn by really listening to the heart at its most irrational? ↩