Summer Plans: The Risks and Rewards of Reviews

The jet lag has lifted and I’m settling back into my routines after my trip to Vancouver–my first real vacation away since July 2015. And even so, it was hard to keep work obligations entirely at bay: a very late paper arrived at 10 p.m. the night before I left and had to be dealt with a.s.a.p.; proofs for a forthcoming review appeared in my inbox a few days along and threw me into a panic until I got reassurance that the corrections could wait until I got back; and a book for another review was my reading material on my way home–although that was my decision, and the book in question (Adam Sternbergh’s The Blinds) isn’t particularly hard work. I don’t really mind: porous boundaries are a small price to pay for the autonomy and flexibility I enjoy at this stage of my career, and there was certainly plenty of work-related business I simply ignored until today, when the Victoria Day holiday too is past.

Now that it is today, though, it’s time to get sorted for the summer. As previously mentioned, my first task is sort of a meta-project, in which this post is a very preliminary step: I want to take some dedicated time to plot out a more deliberate trajectory than I have followed for the last couple of years. It’s not that I’m dissatisfied with what I’ve accomplished: despite the still-embittering lessons of my promotion denial, I have no regrets or second thoughts about where I have been putting my energy or how I have been using my expertise. I certainly have at no point since the bad news felt inclined to rededicate myself to conventional academic publishing. I don’t set myself against it as an enterprise in toto, and I might yet decide that a project I’m interested in is best suited to publication in that form for that audience, but I have long believed that we produce not only enough of such scholarship but too much of it–too much too fast, at any rate, for us to keep up with it ourselves, or to assert its value with any confidence–and so as a profession we can and should spare some of our “HQP” to go and do otherwise.

My version of “otherwise” has so far included a range of essays on Victorian fiction aimed at a non-specialist audience (though not, I have always hoped and often found, lacking in interest for specialists as well); a website and e-book of supporting materials for book clubs reading Middlemarch; this blog, which includes commentary on academia and especially on teaching along with its posts on books and literary culture; and a fair number of book reviews in a widening array of venues. One of the things I’m specifically thinking about right now is what, if any, parameters to set on that last category, especially because for the last year or so I have pretty much always had at least one review underway at all times, and when work is otherwise busy that’s about as much “extra” attentive reading and writing as I can manage. Given that even short reviews still take me several concentrated days, I could almost certainly fill up most of this summer with them if I accepted or sought out all the possible opportunities — but should I?

One reasonable answer is, “Why not?” One pragmatic reason to review as much as I can in as many publications as will have me is that doing so builds both my skills and my “brand” as a reviewer. I get valuable experience, and I gain the kind of credibility as a critic that my academic resume does not earn me outside the ivory tower. At least as important–maybe more–is that I really like the work. It is more intellectually stimulating than I would have thought before I tried it, and more creative: for every book you have to find the story to tell, the tilt to hold it at so you can see it clearly but by your own lights. The different genres of reviewing add a further challenge: the more expansive review-essay of 2,000 or more words we typically run at Open Letters Monthly makes different demands, and allows for different kinds of fun, than a more pointed review of 300, or 700, or even 1,000 words. I have already learned a lot about both books and criticism from practicing in these different forms, and I enjoy feeling that I’m getting better at it. (I have also gained even greater respect for those who do it much more frequently and fluently than I!)

I also like the scale and scope of the work. Each assignment (whether I choose it myself or it is set by another editor) comes with known parameters and a deadline, a finite structure that suits my temperament. There can certainly be stress involved, especially before I know what my angle will be and then as I try to shape my ideas into my allotted space in a way that satisfies me and doesn’t (to my eyes, at least) sacrifice nuance or particularity. As I get more experience, however, my confidence grows, so that now I recognize those messy earlier stages as a necessary phase before I chip away and refine, leaving something as clear and expressive as I can make it. There’s a lot of satisfaction in successfully completing a piece of writing with such a specific mission and then moving along to the next one.

I have also appreciated the way reviewing has expanded my reading, particularly when the books are suggested by other editors rather than hand-picked by me to suit my own known tastes and sensibilities. I would point, for example, to the increase in Canadian titles I have read since taking on some commissions for Quill & Quire and, more recently, Canadian Notes & Queries, though the best example of a writer I would probably never have discovered on my own but loved would be David Constantine. Here, however, is also where the advantages of reviewing shade into the disadvantages: for every David Constantine or Danielle Dutton or Sarah Moss, there’s another writer whose books I would not be bereft to have missed — though of course you can’t know that until you’ve tried them. “Most books aren’t very good,” one experienced reviewer once said to me, and now that I do more reading on demand (though not nearly as much as he does!) and somewhat less just for myself, I understand much better what he meant. There’s a certain resignation every full-time reviewer must feel on opening up the next cover without any expectation of greatness. Of course, that makes it all the more delightful when a book exceeds expectations — which in turn probably accounts for the effusive praise sometimes lavished on books that are pretty good but not that good. For a reviewer who reads, perforce, a lot of mediocre titles, the relief no doubt results in some disproportionate enthusiasm.

So one risk of doing more reviewing is having to read a fair number of books that may not be that good or may not really reward the effort it takes to say something interesting about them. This is not the case when working with George Eliot, whose worst books are still more worthwhile than many writers’ best. Another risk is that the temptation of doing these neatly finite pieces makes it harder to commit to longer-term or more open-ended ones: the immediacy of the next deadline becomes the perfect excuse for putting off what might be harder but ultimately richer writing projects. I said before that I would like to get back to writing more essays–I don’t mean just reviews that are more essayistic, but essays that range and explore literary ideas in a different way. I would like to push my limits and increase my fluency in that genre as well, but I feel as if I have lost my nerve when it comes to proceeding towards an idea that isn’t justified by a specific occasion, such as “here’s a new book,” or framed by a pre-set task and word limit. What could I or should I try to write about? A likely genre for me to pursue here is the literary profile, but I’ve had trouble focusing on a topic, so that’s one thing I’ll be thinking about during my planning period. Another common kind of literary essay is a pitch for the “underappreciated” novel or novelist. I griped a bit on Twitter about what I see as the “literary hipsterism” of this approach, but that needn’t be the tone, and in fact all of the ‘Second Glance’ pieces I’ve written for Open Letters are in this spirit but don’t (I hope) suggest I’m preening because I think I’m particularly cool to know about them! 

But essays too are, in the end, small-scale projects. Should I be aspiring to something on a larger scale? In the academic humanities, books are by far the most valued form; I’ve questioned the assumption that they should be, especially under current circumstances, and though I have watched with a bit of envy as some of the online writers I’ve followed for some time have published books that look really great, I do still feel that you should write a book if you have a book to write–something that needs and deserves a more expansive treatment–not as an end in itself. How do you know if you have a book in you, though? Or, how do you know what kind of book you might have in you, or already have begun without realizing it? More than once here I’ve brought up the possibility of a book that is actually a collection of smaller parts (revised versions of my essays on George Eliot, for instance). I have spent a lot of time on that idea before, including on my last sabbatical, and I even wrote a draft introduction. My work on that project stalled, for various reasons, but perhaps it’s time I took it further. Here, then, is something else I’ll be reflecting on.

In the meantime, I have the Sternbergh review to do, and Sarah Perry’s The Essex Serpent, which I committed to write up for OLM, has just arrived and looks mighty tempting. And I just said yes to another editor for a June deadline. I’m looking forward to doing all of these, but I need to make up my mind how many more I can do if I still want something else to show for my summer.

The Price We Pay: Brian McCrea, Addison and Steele Are Dead

From the Novel Readings Archives: I still find myself thinking a lot about the questions raised by Brian McCrea’s book Addison and Steele Are Dead, which I wrote about during my first year of blogging. Apparently I’m in something of a minority, or presumably I’d be able to find the actual cover image online somewhere! But rereading this post nearly a decade later, McCrea’s theory about the relationship between literature, professionalism, and teaching still seems well worth considering.


In parallel to my reading of ‘books about books’ aimed at non-specialist readers, I have been reading scholarly books that treat the development of English studies and/or academic criticism in historical as well as theoretical contexts. (Examples include John Gross’s The Rise and Fall of the Man of Letters, Morris Dickstein’s Double Agent: The Critic and Society, and Geoffrey Hartman’s Minor Prophecies: The Literary Essay in the Culture Wars. My notes on these have been largely maintained off-line, though my post on Denis Donoghue’s The Practice of Reading comes out of the same line of research.) All of these books (and many more like them, of course) make explicit that what now appear to be the “givens” of professional literary criticism and the discipline of English studies are highly contingent and far from exempt from scrutiny, evaluation, or (presumably) further development.

McCrea’s Addison and Steele Are Dead: The English Department, Its Canon, and the Professionalization of Literary Criticism (1990) is certainly among the more lively and provocative books in this collection. As his title suggests, McCrea frames his consideration of English departments as professional and institutional spaces with arguments about what features in the work of Addison and Steele “render it useless to critics housed in English departments”–not, as he is quick to add, that “their works are without value, but rather, that they are not amenable to certain procedures that English professors must perform.” The opening sections of the book look first at the express intentions of Addison and Steele as critics and men of letters, particularly at their desire to be popular, widely read, accessible, un-mysterious. The short version of his story is that professional critics require difficult, complex, ambiguous texts to do their jobs; the “techniques of simplicity” that characterize Addison and Steele propel them, as a result, out of the canon. (McCrea reports that the last PMLA essay on Addison or Steele appeared in 1957, and that Eighteenth-Century Studies, “the publication of choice for the best and brightest in the field,” published only two short pieces on them in 20 years.) (As an aside, I wonder if a similar argument could be made about Trollope, whose novels often seem difficult to handle using our usual critical tools.)

As he develops his argument, McCrea offers an interesting overview of the 19th-century and then 20th-century critical reception of Addison and Steele. He explains the Victorians’ admiration for these 18th-century predecessors largely in terms of the different understanding that prevailed about the relationship of literature, and thus of the literary critic, to life. Rightly, I’d say (based on my own work on 19th-century literary criticism), he sees as a central Victorian critical premise that literature and criticism are public activities, that their worth is to be discussed in terms of effects on readers; hence the significance attached, he argues, to sincerity as well as affect. Especially key to McCrea’s larger argument is his observation that the 19th-century writers were not “academicians” or “specialists in a field”:

For Thackeray and his contemporaries, literature is a public matter, a matter to be lectured upon before large audiences, a matter to be given importance because of its impact upon morals and emotions. For the present-day academic critic, literature no longer is a public matter but rather is a professional matter, even more narrowly, a departmental matter. The study of literature has become a special and separate discipline–housed in colleges of arts and sciences along with other special and separate disciplines. The public has narrowed to a group of frequently recalcitrant students whose need for instruction in English composition–not in English literature–justifies the existence of the English department.

As McCrea tells the story (which in its basic outlines is pretty similar to that told in other histories of criticism), this decline in the critic’s public role has had both significant costs (among them, the critical ‘death’ of Addison and Steele) and significant benefits. At times the book has a nostalgic, even elegiac sound:

People who want to become English professors do so because, at one point in their lives, they found reading a story, poem, or play to be an emotionally rewarding experience. They somehow, someway were touched by what they read. Yet it is precisely this emotional response that the would-be professor must give up. Of course, the professor can and should have those feelings in private, but publicly, as a teacher or publisher, the professor must talk about the text in nonemotional, largely technical terms. No one ever won a National Endowment for the Humanities grant by weeping copiously for Little Nell, and no one will get tenure in a major department by sharing his powerful feelings about Housman’s Shropshire Lad with the full professors.

While we can all share a shudder at the very idea, to me one strength of McCrea’s discussion is his admission that marginalizing affect, pleasure, and aesthetic response is, in a way, to be untrue to literature, and that the professional insistence on doing so also, as a result, marginalizes our conversation, alienating us, as McCrea says, “from our students, our counterparts in other academic departments, our families [unless, he allows, they include other professional critics–otherwise, as he points out, even they are unlikely to actually read our books and articles], and, ultimately, any larger public.” (In Democracy’s Children, John McGowan makes a similar point: “There remains a tension between the experience of reading literature and the paths followed in studying. . . . To give one’s allegiance to the academic forms through which literature is discussed and taught is to withdraw [at least partly] allegiance to literature itself”).

But why, McCrea goes on to consider, should we expect such cross-over between our work–our professional lives and discourse–and our personal lives? McCrea’s answer to this question (we shouldn’t) puts the professionalization of English studies into the context of professionalization more generally, which he argues (drawing on sociological studies) was a key feature of American society during the last half of the 20th century. Perhaps the most distinctive feature of McCrea’s book, in fact, seems to me to be his insistence that, in this respect at least, ‘professing English’ is (or has now become) just another job, and indeed that its success at establishing itself professionally at once accounts for and has depended on its investment in theory and metacommentary: “The ultimate step in the aggrandizement of any professional group is for its members to get paid to talk about how they do what they do rather than doing it.” If one result is isolation from and (perceived) irrelevance to the broader public, including the reading public, the gains for criticism and even for literature are also, McCrea argues, substantial:

Rotarians no longer look to us for uplift, future presidents no longer turn to us to increase their ‘stock of ideas,’ nor do ex-presidents attend our funerals, undergraduates no longer found alumni associations around us, family members can no longer read our books, and plain English has disappeared from our journals. But professionalization has liberated us from a cruel Darwinian system in which one white, Anglo-Saxon, Protestant male emerged at the top while others struggled at the bottom, grading papers in impoverished anonymity. It has liberated us from the harsh economic realities of eighteenth-century literature . . . while [today’s critics] might wish to share Steele’s influence, I doubt they would want to share his life. He practiced criticism in a world in which there was no tenure, a world devoid of university presses, National Endowments for the Humanities, and endowed university chairs in literature. . . .

In a society in which no one outside the classroom reads Pope, professors can earn handsome incomes by being Pope experts. The five top Pope experts compete with each other, but probably not with the Tennyson experts, and certainly not with the Chaucer experts. The quest for autonomy has cost us Addison and Steele, has cost us the ability to treat literature as a public, moral, emotional phenomenon. But it has left us with a part of literature, with a canon of works complicated in their technique and tone, and with a classroom in which we have a chance to teach those works, to keep them (and whatever value they hold) alive.

Provocative, as I said, not least in reversing the oft-heard line that (undergraduate) teaching is the price professors pay for the opportunity to do their research and as much as declaring that, to the contrary, academic criticism is the price they pay to preserve literature and its values.

Originally published in Novel Readings August 8, 2007.

This Week In My Classes: Poetry and Prose

That was a busy week! Not only was it the first full week of term, with both classes and committee meetings, but I was involved in a Ph.D. comprehensive exam, which is something we usually do when classes aren’t in session. Obviously it’s the student who has the biggest job, but the committee has to read the written papers and prepare questions for the oral exam. Happily, it went well (congratulations, Laura!), and next week things should settle into more of a routine.

In Close Reading I always start with poetry, partly because it’s just easier to model and practice mining details for meaning when working with shorter, denser texts. Even in Middlemarch (don’t tell anyone I said this!) there are places it’s probably okay not to scrutinize every word, but a sonnet such as Robert Frost’s “Design” demands our unrelenting attention. I reviewed some key terminology on Monday, and then Wednesday and Friday were all about scansion, something I think is not just vital (who can talk well about poetry without considering rhythm?) but kind of fun. However, despite my best efforts, I am almost never able to convince the majority of my students that it is anything but aggravating: the stress was palpable in both tutorials on Friday!

One of the problems, of course, is that while there are things you can do wrong, there isn’t just one right result: you need to use your ear and your judgment (which in turn relies on your understanding of the whole poem, including both form and themes). As far as possible, I try to shore up their confidence by proposing methodical steps to follow: be sure you are pronouncing words correctly; mark in stressed and unstressed syllables first where you do not have any choice (it’s never spi-DER, it’s always SPI-der); at least initially, assume little words aren’t strong beats but nouns are; wait until you’ve done several lines before deciding what pattern you see, because good poets like rhythmic variation. Ultimately, though, you do have to rise to the occasion of the poem itself and make some decisions about how you think it is best read. Sometimes a poem steers you towards a more regular (and thus possibly more artificial “poetic”) rhythm, with a strong predictable beat that isn’t necessarily how you would “naturally” speak its sentences (Poe’s “The Raven,” to me at least, works this way), while other times a poem demands to be read dramatically.

I almost always end up using lines from Donne’s Holy Sonnet X (“Death, Be Not Proud”) to illustrate just how interesting, important, and even exciting scanning poetry can be. For one thing, it’s a poem that quickly teaches you not to read it in anything like mechanical iambic pentameter: “Death, BE not PROUD, though SOME have CALLed THEE / mighTY and DREADful, FOR thou ART not SO”? You wouldn’t. You mustn’t. And not just because that’s not how you pronounce “mighty.” You’re standing up to Death! At the very least, you have to call him out in that first syllable: “DEATH, BE not PROUD.” You might even do four stresses in a row — “DEATH, BE NOT PROUD” — or maybe that’s too much. I’m tempted to do “for THOU ART NOT SO” as well, but my reading of the poem may be more confrontational than others would like. At any rate, you have to say it as if you mean it, which makes scanning the poem actually quite a profound exercise.

In The Victorian ‘Woman Question’ we read Frances Power Cobbe’s 1868 essay “Criminals, Idiots, Women, and Minors,” a powerful attack on the irrational and unjust laws governing married women’s property, along with Margaret Oliphant’s 1858 essay “The Condition of Women,” in which she wonders why women are complaining so much (we agreed that “don’t young men have it pretty tough too, with all their college degrees but no clear vocation?” is not her most compelling argument!). And we read J. S. Mill’s The Subjection of Women (1869), which of course is a classic text in the development of liberal feminism. It is always interesting to see how strikingly modern it can sound (on this reread, I was particularly interested in Mill’s discussion of unearned male privilege and its deleterious moral effects) even as it betrays its Victorianism in other moments (for instance, in Mill’s comments that left to themselves, women will almost certainly still end up choosing marriage and motherhood over other options, and that the domestic arrangements of the household make pretty good sense as they are).

My main goal with these early readings is to start us off with a sense of some of the Victorian debates about women, including idealistic notions of their angelic influence and delicate sensibilities (with all the pit-and-pedestal consequences of that view) as well as contrary views and arguments for their rights and abilities. This lets us place the arguments we’ll encounter in our novels and poems, where they are often made less directly — dramatized rather than theorized — into their contemporary contexts. Next week it’s Anne Brontë’s The Tenant of Wildfell Hall, for example, which will show (among other things) that the idea of women’s influence is just that, an idea, one that means very little compared to the overt power of a man determined to have his own way.

Innovation and the Eye of the Beholder

On university campuses we hear a lot about innovation these days, from hype about the latest ed-tech fad to proclamations by institutions like my own about fostering a “culture of innovation.” This has got me reflecting on how we define or recognize innovation — something that is not as obvious, I think, as its champions, or as those who insist on it as a measure of academic success, typically seem to assume. In some fields, of course, it’s easy enough to tell when something is new, if it shifts or breaks a paradigm. But in others, context makes all the difference, as my own chequered career as a “thought leader” demonstrates.

Exhibit A: my undergraduate degree. When I first started at UBC in 1986, I intended to major in history. I was an avid reader, but it had never occurred to me to study reading. I changed my mind, obviously, thanks in large part to my first-year English professor, Don Stephens. (This is one reason I try never to underestimate the importance of our own first-year classes. They can literally change lives.) I didn’t want to give up history, though, and so I asked if it would be possible for me to do my Honors degree in both departments. It turned out that until then, nobody had done a combined English-History Honors degree, so the logistics all had to be specially worked out. (This was ultimately done by the simple method of adding up the key requirements, so that, for instance, instead of the 3-credit English Honors essay or the 6-credit History Honors essay, I did a 9-credit essay, with double the usual number of supervisors, readers, and examiners. I ultimately defended it to a panel of 7 professors.) Administratively, this was innovative, then — but intellectually, the work I did was very much in line with current trends in both disciplines.

Today, of course, an interdisciplinary degree is wholly unremarkable; Dalhousie even has an entire Interdisciplinary Ph.D. program (for which I have done one supervision myself). Even by the time I got to Cornell to pursue my own Ph.D. in English, though, nobody raised an eyebrow at my interest in historiography. In retrospect, I think my role as an innovator actually reflected less on me than on the somewhat fusty assumptions governing UBC’s degree requirements at the time — particularly in History, where I met the most skepticism about my proposal, but also in English, where the Honors program still required one course each in Chaucer, Shakespeare, and Milton.

Exhibit B: my feminism. In my undergraduate history seminars, I was something of a feminist agitator. I particularly remember the efforts my friend Helen and I made to get some scholarship about gender onto the reading lists. We were unsuccessful in our mandatory historiography seminar — I remember one male student pushing his chair back from the table and exclaiming in disgust “But you’re trying to change something in your culture!!” Well, yes, we were: in our wider culture and in our immediate academic culture, in which the male students thought it was pretty funny to see if they could get us (“the feminists”!) riled up. But we were successful in our Renaissance history seminar: I still recall with admiration (and some self-satisfaction) the professor’s comments to the class at the end of the term that he was glad we had pushed for readings like Joan Kelly’s “Did Women Have a Renaissance?” because they had prompted him to reconsider some of his own working assumptions. That’s integrity! And our interventions were clearly innovative: we were very cutting edge!

But when I got to Cornell, I discovered that far from being a radical, I was actually a conservative! It turned out that there were some kinds of questions you couldn’t safely ask there, arguments you couldn’t seriously entertain, without undermining your feminist credentials. My first big mistake was giving a seminar paper called “The Madwoman in the Closet”: it queried some then-dominant trends in feminist criticism, particularly in 19th-century studies, and tried (perhaps crudely, but I was a beginner at all of this — and frankly, my somewhat old-fashioned training at UBC had not prepared me well for it) to figure out how politics and aesthetics were getting balanced (unbalanced, I thought, maybe, possibly) in the debates. My professor was keen to have these discussions, but said to me quite frankly that he felt that as a male professor, he couldn’t raise these questions. So I blundered in, and paid the price. I also wrote a more or less positive review of Christina Hoff Sommers’ Who Stole Feminism — I strongly doubt I would write the same review today, but I distinctly remember how scrupulous I tried to be, looking up the statistics and studies she cited and trying to think my way through the arguments she made. As I recall, this review (the first one I ever published!) was far from a cheerleading piece — it was more in the spirit of “these seem like questions worth asking” — but it can’t have done my developing reputation as an ideological throwback any good.

Yet at Dalhousie, gender issues have always been central to my teaching (as they have been to my scholarship) — I’ve even had at least one student complain that I was “pushing feminism down our throats.” More positively, I have had many appreciative comments from students, including one this year who said mine was the first class she’d taken in which “social justice” issues including feminism were simply integrated into the curriculum, even though the course itself wasn’t labelled as a class in “women’s studies.” It’s impossible not to wonder how much I have actually changed, and how much it’s just the shifting contexts around me that make me look different.

Exhibit C: my critical writing. There are many possible angles to consider here, but I’ll focus on my recent work outside of academic publishing, because its status has been much on my mind lately. In a way, the kind of criticism I’ve been doing recently — from book reviews to literary essays — is not innovative at all: it’s the same kind of work everyone else is doing who also writes for newspapers and magazines and literary journals. But from an academic perspective, to be writing for those venues instead of for academic journals is itself innovative: it’s the kind of thing that gets called “knowledge mobilization” or “knowledge dissemination” or “public humanities.” Except that some of these publishing ventures resemble (in style, not necessarily in content) an older kind of literary criticism — a kind some might call belles lettres — which is now considered passé in academic circles. So my recent work could be considered retrograde, not innovative. Except that to break from the conventions of academic writing and try to replicate the best qualities of belles lettres (fine, smart, accessible writing, with its own literary elegance) while still doing criticism informed by decades of academic scholarship … couldn’t that combination of new insights and old forms itself be innovative? Then, what about the content of the reviews and essays? Every new interpretation of a literary text is a critical innovation, isn’t it? So every review of a new book, representing a new intellectual encounter, is intrinsically ground-breaking, even if book reviewing as a form is the oldest kind of literary criticism. What if you make a new critical argument, based on original research, but in an essay outside the norms of academic publishing — if that argument falls in the forest, can anybody hear the innovation? Or what if the argument of an essay is new to one audience but not to another? What is going on then?? Am I doing original work or not???

Oops. That last part possibly got away from me a little! But I think you get my point: determining whether something — an interpretation, an argument, a curriculum, a research project, a work of criticism — is innovative, new, original is not always straightforward. It depends on definitions, expectations, and above all, on contexts. The “flipped classroom” is nothing new to English professors who for years have been assigning texts to be read outside of class and using class time for discussion. “Student-centered learning” is no great revelation in disciplines that have always been based on Socratic exchanges, held seminar classes, and taught students to develop their own essay ideas into original arguments based on their own research. But that these are old practices in some contexts does not mean they aren’t valuable ones, or that people shouldn’t try them in other contexts, if they seem promising there. What matters should not be innovation for its own sake: we should stop fetishizing it as an end in itself, as if either its definition or its importance is self-evident. I’m not against innovation — of course not! And we should certainly encourage and support people who risk doing something outside their immediate limiting norms because they think it will serve the university’s mission — because we shouldn’t want what is now to be mistaken for what should always be, or always was, in any context. It’s just strange to me how absolutely the term “innovation” is used, how confidently it gets invoked — and how, ironically, it can actually be used to reinforce orthodoxies if we never double-check our assumptions about it.

What Price Genius? Helen DeWitt, The Last Samurai

Great news: New Directions is putting out a new edition of Helen DeWitt’s The Last Samurai, which is without a doubt one of the best, most surprising, and most moving novels I’ve read in the last decade or more. I’m excited to reread it when it appears in all its finery. In the meantime, here’s what I wrote about it when I read it for the first time.


The Last Samurai is the story of a single mother, Sybilla, and her son, whom she calls “Ludo”–though on his birth certificate it says either ‘David’ or ‘Stephen,’ ‘one or the other.’ It makes sense that Sybilla would consider it pointless to be certain, because one of the things this novel is about is precisely how we figure out and then live up to who we think we are. It’s also about the accidents that determine the lives we lead, regardless of who we might be, and about the choices and values and loves and hates and languages and books and ideas and music and art and movies and people that constitute those lives and make them worth living–or not. It’s a celebration of genius and an attack on mediocrity, a paean to the human capacity to create and learn and think and reason and a lament for the seductions of banality. It’s about quests and heroes and, of course, samurai. Its parade of erudition is at once dazzling and surprisingly entertaining, and also inspiring, because it’s in the service of intellectual curiosity and love of knowledge, not accomplishment or grades or prizes. It’s Ludo’s curiosity, in particular, that gives the novel its momentum: he is a child prodigy whose brilliance at once thrills and terrifies his mother. Ludo’s voice, and his quest for his father, eventually take over the novel from Sybilla, but she remains its presiding genius; without her, Ludo’s endless questions would go unanswered. Though their relationship is never sentimental (indeed, they rarely seem like parent and child, at least in the ways we would casually expect), their attempts to care for each other have an emotional intensity and an intellectual integrity that are ultimately very moving. A book so extravagantly episodic and allusive risks losing its humanity. Somehow, miraculously, for all its jouissance, all its postmodern display, The Last Samurai never does.

This is a novel that feels exceptionally difficult (and more than usually pointless) to excerpt from–and yet, the temptation! And it incorporates so much that it’s difficult to know what to single out for commentary. One aspect of it that is obviously very important, both structurally and thematically, is its engagement with Kurosawa’s The Seven Samurai (which I have never seen–but the range of things alluded to in this novel that I don’t know first-hand is so long there’s no point remarking them all). The Seven Samurai is Sybilla’s favourite film. Not only does she watch it over and over, but she thinks of it as taking the place of a male role model in Ludo’s life. What she doesn’t expect, when she first shows it to him (when he’s five), is that it will prompt him to demand to learn Japanese.

L: When are you going to teach me Japanese?

I: I don’t know enough to teach you.

L: You could teach me what you know.

I: [NO NO NO NO NO] Well

L: Please

I: Well

L: Please

Voice of Sweet Reason: You’ve started so many other things I think you should work on them more before you start something new.

L: How much more?

I: Well

L: How much more?

The last thing I want is to be teaching a five-year-old a language I have not yet succeeded in teaching myself.

I: I’ll think about it. . . .

Her problem is that Ludo is urgent with his demands to learn, not just Japanese, but Latin and Greek and much much more, and that there isn’t, really, any reason not to teach him whatever he wants to know except the widespread (mis)understanding that he is too young for this kind of thing–a view they encounter over and over as they ride the Circle Line to keep warm:

. . . he has been reading the Odyssey enough for a straw poll of Circle Line opinion on the subject of small children & Greek.

Amazing: 7

Far too young: 10

Only pretending to read it: 6

Excellent idea as etymology so helpful for spelling: 19

Excellent idea as inflected languages so helpful for computer programming: 8

Excellent idea as classics indispensable for understanding of English literature: 7

Excellent idea as Greek so helpful for reading New Testament, came through eye of needle for example mistranslation of very simple word for rope: 3

Terrible idea as study of classical languages embedded in education system productive of divisive society: 5

Terrible idea as overemphasis on study of dead languages directly responsible for neglect of sciences and industrial decline and uncompetitiveness of Britain: 10

Stupid idea as he should be playing football: 1

Stupid idea as he should be studying Hebrew & learning about his Jewish heritage: 1

Marvellous idea as spelling and grammar not taught in schools: 24

(Respondents: 35; Abstentions: 1,000?)

Oh, & almost forgot:

Marvellous idea as Homer so marvellous in Greek: 0

Marvellous idea as Greek such a marvellous language: 0

What place genius, what price genius, in a world like this? These are among the difficult questions Sybilla faces, as she reads about the education (and eventual breakdown) of John Stuart Mill, or about “the example of Mr. Ma (father of the famous cellist).”

One of the most fascinating explorations of this in the novel is the story of the pianist Kenzo Yamamoto, who becomes obsessed, not with how to play a particular note or phrase or piece, but with how else you could play it, or how else it could sound:

Yamamoto: To put it another way, let’s just take a little phrase on the piano, it sounds one way if you’ve just heard a big drum and another way if you’ve heard a gourd and another way if you’ve heard the phrase on another instrument and another way again if you’ve just heard nothing at all–there are all kinds of ways you can hear the same sound. And then, if you’re practising, you hear a phrase differently depending on how you’ve just played it, you might play it twenty or thirty different ways and what it actually is at any time depends on those things it might be–

He gives a disastrous concert at Wigmore Hall in which he plays “about 20 minutes of drum music after each of six [Chopin] Mazurkas . . . with the result that the concert ended at 2:30 in the morning & people missed their trains & were unhappy.” Sybilla takes Ludo to hear Yamamoto in concert at the Royal Festival Hall. The first half is uneventful, but after the interval, Yamamoto begins to play the Brahms Ballade Op. 10 No. 1, first just phrases and then eventually the whole piece:

For the next seven and a half hours Yamamoto played Op. 10 No. 1 in D minor, and sometimes he seemed to play it exactly the same five times running but next to the sound of a bell or an electric drill or once even a bagpipe and sometimes he played it one way next to one thing and another way next to another. . . .

Eventually he plays it through nine times along with a tape of traffic and footsteps, then when the tape stops and there is silence he plays it “so that you heard it after and over the silence.” Then, after all those hours playing Op. 10 No. 1, the audience is “shocked to hear in quick succession Op. 10 No. 2 in D major, Op. 10 No. 3 in B minor and Op. 10 No. 4 in B major, and you only heard them once each”:

It was as if after the illusion that you could have a thing 500 ways without giving up one he said No, there is only one chance at life once gone it is gone for good you must seize the moment before it goes, tears were streaming down my face as I heard these three pieces each with just one chance of being heard if there was a mistake then the piece was played just once with a mistake if there was some other way to play the piece you heard what you heard and it was time to go home.

Her bitterness at the inadequacies of the Circle Line riders is balanced by this moment of grace. Why do we put such limits, not just on our children, but on our art? Much, much later in the novel, Yamamoto says to Ludo, “When you play a piece of music there are so many different ways you could play it. You keep asking yourself what if. You try this and you say but what if and you try that. When you buy a CD you get one answer to the question. You never get the what if.” There’s no place for Yamamoto’s “what if” in the world of concert halls and recording studios and trains to catch.

The risk DeWitt takes is that this dedication to the highest possible forms becomes, or at least will come across as, sheer elitism, a blunt attack on popular taste. About a third of the way through the novel, pestered endlessly by Ludo for the name of his father, Sybilla presents him with a challenge: she gives him a tape of Liberace, a drawing by Lord Leighton, and a magazine article and tells him “You will not be ready to know your father until you can see what’s wrong with these things.” More than that,

Even when you see what’s wrong you won’t really be ready. You should not know your father when you have learnt to despise the people who have made these things. Perhaps it would be all right when you have learnt to pity them, or if there is some state of grace beyond pity when you have reached that state.

As Ludo takes over as the novel’s narrator and the plot (to the extent that it is linear) becomes the story of his attempt to find (or choose) his father, this quest to discern the failings of Liberace (which is, not incidentally, also the code name Sybilla uses for Ludo’s father), of Lord Leighton, and of the boring magazine article runs in parallel. I wasn’t sure I wanted Ludo to grow up into another Sybilla, or even to pass her test–Sybilla herself does not live happily or easily with her ideas, after all–and yet the whole book pits itself against relaxing into easy compromises, whether moral or ethical or aesthetic (and I’m not sure that the novel allows for a distinction between these). There’s nothing easy about Ludo’s progress towards the novel’s conclusion, but I think that through each of his encounters with potential fathers, he learns and grows in ways that eventually exceed what Sybilla wanted, or even thought was possible, for him.

There’s much more to The Last Samurai than this, but if I started listing off more of its ingredients it would make the novel sound like a kind of flamboyant bricolage rather than the gratifyingly readerly treat it is.

Originally published at Novel Readings October 26, 2011.

On Having and Earning Critical Authority

I don’t want to leave the impression that frustration with the rigidity of academic practices is all I took away from my Louisville conference experience. There was definitely value for me in the work I put into my own paper, as well as in hearing and discussing the papers my co-panelists presented. So I thought I’d follow up my previous post with a sketch of the questions I went to Louisville to talk about.

My paper was called “Book Blogging and the Crisis of Critical Authority.” During the discussion after our papers, all of the panelists agreed that things have died down since the days when you could hardly turn around without seeing yet another “bloggers ruin everything” article. A few diehards still take every opportunity to decry the temerity of feckless amateurs who think they can just go online, say whatever they want, and call themselves “critics” (I’m looking at you, William Giraldi), but by and large (as Dan’s paper convincingly argued) the success of many serious web magazines has proven that online criticism can be as good as if not better than its old media competition, and book blogs in all their idiosyncratic variety are now a familiar, if not always respected, feature of the critical landscape.

Daniel Mendelsohn’s conspicuously temperate “Critic’s Manifesto” was one sign of the changing times; in it he acknowledged (as so many of his professional colleagues would not) the existence of “serious longform review-essays by deeply committed lit bloggers.” Mendelsohn did still conclude that “everyone is not a critic”; he cites “expertise and authority” as crucial qualifications (“knowledge … was clearly the crucial foundation of the judgment to come”) along with a more ineffable quality that he sums up as “taste” (“whatever it was in the critic’s temperament or intellect or personality that the work in question worked on”). Though he concedes that the requisite knowledge does not depend on formal credentials such as Ph.D.s, he does ultimately describe the critic’s job as being “to educate and edify” — so, it’s still a top-down or hierarchical model.

Mendelsohn’s article was one of the sources I cited in my paper, in which I explored some questions about what we mean by “critical authority.” As he notes, once you move outside the academy, degrees are neither a necessary nor a sufficient measure of the relevant expertise. But it’s not easy to pin down what does count, how authority is established, especially in a field of inquiry where there are no sure or absolute standards of judgment. Literary critics know that their authority is unstable because the history of criticism teaches us how judgments change over time, while simple experience shows us how much they differ among individuals. We can call variant assessments “gaffes” or “errors in individual taste,” as Mendelsohn does in his recent New York Times review of A. O. Scott’s Better Living Through Criticism, but he can’t actually prove that “early and wince-inducing takedowns of John Keats’s poetry, [or] of ‘Moby-Dick’” are flat-out wrong any more than I could convince my Modernist colleagues that George Eliot is objectively a better novelist than James Joyce. Still, the rhetoric of criticism as well as its traditional methods of delivery typically seek at least the appearance of offering definitive judgments. As Sebastian Domsch argues in his interesting essay about ways the internet transforms critical genres, criticism has typically attempted to be and sound “monologic,” as if “everything that needed to be said has been said and there are no more follow-up questions possible.”

One reason blogging aroused such hostility, I proposed, was that it exposed the artifice of this model, and indeed of any idea of literary criticism as a series of edicts issued from on high, leaving critics themselves exposed, not as frauds, but as less authoritative than they pretended to be. As Mendelsohn says in his review of Scott, “the advent of the Internet” has “rais[ed] still further questions about authority, expertise and professionalism”; I argued that it has done so by breaking down monologic forms and exposing the inherently dialogic nature of both critical judgments and critical authority. Domsch defines “critical authority” as “the level of acceptance that is conceded by a reader to an aesthetic value judgment”: I think he is right to emphasize that this kind of authority is not inherent in the speaker but conferred by context and audience. In my paper I drew on Wayne Booth’s notion of “coduction” to make the case for the importance of dialogue in developing critical judgments, and I pointed to blogging as a form that establishes “follow-up questions” as both a natural and an inevitable part of criticism.

If critical authority is not something you simply have but something you have to earn and maintain by your own participation in a dialogue — if it is best understood not so much as a top-down assertion of superiority (“the critic’s job,” Mendelsohn proposes in his recent review, “is to be more educated, articulate, stylish, and tasteful … than her readers have the time or inclination to be”) but as a process of establishing yourself as someone whose input into an ongoing conversation is sought and valued — that helps explain why “expertise” is such a tricky thing to define for a critic. Mendelsohn’s original formal training is as a classicist — despite his wide-ranging erudition and critical prestige, he would almost certainly not qualify for an academic position in any other field — but obviously he has written with considerable insight on a wide range of subjects, from Stendhal to Mad Men. That so many of us read Mendelsohn’s criticism with interest and attention no matter what he writes about is a sign that we have come to trust him, not as the last word on these subjects, but as someone who will have something interesting (“meaningful,” to use one of his key terms) to say about them. If we disagree with him, we are not challenging his authority but continuing the conversation — and in fact one thing I’ve been thinking a lot about is how little disagreement really matters to this kind of critical authority. If what we go to criticism for is a good conversation, then engaged disagreement can be seen as a sign of authority — a sign that you care enough about the critic’s perspective to tussle with it, if you like.
I can think of a number of critics in venues from personal blogs to the New Yorker whose views I would not defer to, but which I want to know because they provoke me to keep thinking about my own readings — which (however definitive the rhetoric I too adopt in my more formal reviewing) I always understand to be provisional, statements of how something looked to me in that moment, knowing what I knew then, caring about what I cared about then.

I’m not saying we can’t or shouldn’t defend our critical assessments, but awash as we are and always have been in such a variety of them, it would be naively arrogant at best and solipsistic at worst to imagine ourselves as “getting it right,” no matter who we are or where we publish. Blogging very often reflects that open-endedness in its tone, and its form is based on just the process Booth describes as “coduction”:

‘Of the works of this general kind that I have experienced, comparing my experience with other more or less qualified observers, this one seems to me among the better (or weaker) ones, or the best (or worst). Here are my reasons.’ Every such statement implicitly calls for continuing conversation: ‘How does my coduction compare with yours?’

The comment box makes that implicit call explicit. This doesn’t mean “erudition, taste and authority” (the qualities Mendelsohn repeatedly invokes) don’t matter — though the extent to which they matter will depend on what you want from criticism. Domsch argues, for instance, that Amazon reviewing ultimately returns us to the most monologic form of criticism: people seek out, or are steered to (by algorithms, ‘like’ buttons and so on) the reviewer whose views and tastes are closest to their own, and once they find their “virtual” critical self, their critical proxy, as it were, they have found their perfect authority, a guarantor of their own well-established tastes. But Amazon is fundamentally about shopping. If you read criticism for some reason other than deciding which book to buy next, you are likely to look for and concede authority to different qualities. In my paper I noted that I don’t want to be told about books — I want to talk about books. So sympathetic as I am with most of what Mendelsohn says, I resist his insistence on the critic’s superiority as a necessary or structural part of the relationship.

The result of accepting, rather than resisting, the challenge blogging poses to old-fashioned critical forms is, I argued, not a catastrophically relativistic criticism of the kind Peter Stothard dreaded but a pluralistic criticism, such as that described by Carl Wilson in Let’s Talk About Love:

a more pluralistic criticism might put less stock in defending its choices and more in depicting its enjoyment, with all its messiness and private soul tremors — to show what it is like for me to like it, and invite you to compare. This kind of exchange takes place sometimes on the internet, and it would be fascinating to have more dialogic criticism: here is my story, what is yours?

I’d be very interested to know what you think about this argument, particularly about my proposal to redefine “critical authority” in a more reciprocal and context-dependent way than the anti-bloggers always do. What makes a critic “authoritative” to you? Or is “authority” not something you think or care about? If it isn’t, how would you explain what makes a critic someone you want to listen to or engage with? Are there critics you pay attention to because their taste (I might prefer the term “sensibility” myself) reflects yours, or because they push you to less familiar points of view? Does disagreeing with a critic make you doubt them, or does it depend on the critic, or the context? More generally, what do you want from criticism, and how do you think that affects where you read it and who you listen to?

The first picture here is one I took of the Big Four Bridge across the Ohio River from Louisville to the Indiana side. It was a really nice walk across and back!

This Week In My Classes: Reading Against the Grain

I have really enjoyed rereading Adam Bede for my graduate seminar over the past two weeks. Though I know the novel reasonably well, I have never spent the kind of dedicated time on it that I have on Middlemarch or The Mill on the Floss — or, for that matter, on Romola. I’ve never even assigned it in an undergraduate class, I realize! Still, I do have a half-finished (well, maybe one-third-finished) essay on it for Open Letters that was (is?) going to focus on the line between explaining and justifying, between understanding and forgiving. This is a problem raised in most of Eliot’s novels, but Hetty’s infanticide is an extreme test case: there’s nothing abstract about the consequences of her crime, nothing diffuse or dispersed about the damage done, as there is with, for example, Bulstrode’s lies or Tito’s betrayal. “Children may be strangled, but deeds never,” says the narrator rather chillingly in Romola, but it’s really only in Adam Bede that there’s a literal child to mourn rather than an intangible (if irrevocable) fault.

Though the novel is called Adam Bede (a faintly puzzling choice that we talked about several times in class), Hetty is by far its most interesting element: both the drama of her story (especially the still-gripping-after-all-these-years journeys in hope and despair) and the meticulous care with which Eliot presents her vain, shallow, artless, and ultimately tragic character. Critics sometimes accuse Eliot of being hard on her beautiful women in general and on Hetty in particular. It’s true we’re shown Hetty in a very unflattering light, despite the emphasis on her kitten-like charms. That seems to me the only plausible option, though, if we are going to go through the moral exercise the novel sets for us of sympathizing with “more or less ugly, stupid, inconsistent people.” The point is not to help us see Hetty in a kindly light, to show us that she’s somehow better than she seems — but to show us that however irredeemably selfish she is, however incapable of self-reflection, nonetheless the onus is on us to “tolerate, pity, and love” her. Dinah, of course, is our model for that moral transcendence, and though she herself is rather a dull character, I think the meeting between the two women in prison is thrilling. (I wrote a little bit about it near the end of this essay on faith and fellowship in Middlemarch.)

So, there’s all that, and luxurious landscapes, and dramatic rescues, and Mrs. Poyser to boot — what’s not to love?

But I had much less fun rereading some of the critical articles I’d assigned, even though they are smart and well-argued and thought-provoking and all the things that they should be. I was trying to figure out why, and what I came up with was that in many ways they position themselves against George Eliot, against Adam Bede as she offers it to us. I’ve been reading and writing for so long now outside of academic parameters that I’ve become less accustomed to the “hermeneutics of suspicion,” or to readings that are less interested in the discussion the author is overtly having with us than in undermining or second-guessing or critiquing the terms of discussion the author has chosen. I would never argue that such critiques are illegitimate; often, too, they establish a valuable chiaroscuro in a robust appreciation (who today can love Dickens, for instance, without also conceding that his women often disappoint?). It would be naive, or worse, to pretend that there’s nothing objectionable to be found — even in George Eliot! (Yes, her politics are cautious to the point of conservative; yes, she’s essentialist about gender; yes, she can be less than rhapsodic about coarse peasants; etc.) I think that right now, though, for me it’s less rewarding to do or read criticism that digs in on these issues when there is so much that is progressive and aspirational, and also beautiful, in her writing. What are we to do with Adam Bede, after all, if we conclude that it perpetuates or advocates a vision (a version) of society that we reject? Close it and put it away for good?

Almost certainly not, of course, and I don’t think that’s what any of the critics we read are saying either. Usually (as I take it) the implicit subtext is something more like “read it in a more complicated way,” or “approach with skepticism.” Don’t, in other words, take Eliot’s words for granted, which is exactly the mantra I’ve been insisting on in my Introduction to Literature class — except that there, the purpose is not to catch out or undermine the author but to appreciate their artful use of language to serve their ends. That approach is consistent with ultimately finding those ends problematic, but it’s still overall a more positive exercise. (That seems both right and necessary as a first step: you can’t effectively critique what you don’t thoroughly understand, after all.)

Writing this, I am plagued by a sense that I’m being inconsistent, maybe even hypocritical. I definitely resist some books and read them, if not suspiciously, at least with something quite other than appreciation. I’ve also committed a lot of time and thought to the importance of ethical criticism, which is fundamentally about questioning the implications of an author’s literary strategies, as much as or more than it is about identifying their overt or covert political commitments. Maybe I still haven’t rightly identified the source of my annoyance, then — or maybe what it comes down to is just that I prefer my Adam Bede to the Adam Bede I saw in some of the critical essays. The miraculous thing about great books is that all these versions can coexist, that all these things can be going on at once. Love, too, can coexist with criticism — even my love for Middlemarch, which is complicated but not diminished by my anxiety that there is something potentially dangerous about its most beautiful moments.

Finished with Ferrante. Probably Forever.

I actually hadn’t intended to read The Story of the Lost Child. By the time I finished Those Who Leave and Those Who Stay, I felt that three long volumes of minutiae (however intense) and interpersonal angst (especially between two characters who never seemed either particularly plausible or particularly interesting) was plenty. It’s not that I didn’t think the first three Neapolitan novels were any good. They are good — probably better than most recent novels I’ve read. But after a point, it was impossible to read them without sky-high expectations, because their overall reception has been both so positive and so uncannily uniform. Raw! Honest! Confessional! Brave! (I wrote about the critical phenomenon already in some detail, in a piece that I thought might generate some self-conscious discussion among the feverish Ferrante fans or just people interested in the general issue of women’s writing and its reception. It didn’t.) And how many novels are really that great?

I got a tempting invitation to review the fourth volume, though, and so I did end up reading it. I’m not entirely sorry that the review has ultimately dead-ended, as during the editorial back-and-forth it was turning into something I didn’t really care for, that didn’t even sound like me. (That’s undoubtedly because it also didn’t start out very well, at least for its intended purpose: I’m not blaming anyone but myself.) I’m not entirely sorry I read The Story of the Lost Child either, though, because like its predecessors, it is pretty good, and after the investment of reading the first 1000 pages of a series, it is nice to know how it all wraps up. At this point, though, especially after two frustrating weeks immersed all over again in her work, I’m fed up with both thinking and writing about Ferrante. Anyone who wants to read a deep, thoughtful commentary about her should read Alice Brittan’s “Elena Ferrante and the Art of the Left Hand” in this month’s Open Letters. Alice loves the novels, but she also comes at them, as she comes at every book she writes about, from an unexpected angle, so though there’s plenty of enthusiasm on display, it’s not of the “these books are the awesomest, bravest, most honest, truthful, confessional, searing, epic portraits of women’s lives and female friendships ever” variety. (I’m sure not every other review is like that either, but that’s certainly the general flavor of Ferrante criticism.)

Here is the short version of my ‘take.’ The Neapolitan novels are good books, but to me they represent novels as blunt instruments. They have a lot of detail, but not a lot of nuance, especially stylistically. (Requisite caveat: maybe in the original Italian, they are different, better, more subtle.) In particular, the first-person narration is ultimately a disappointment, both artistically and thematically. Elena is not much of anything: she is neither unreliable nor interestingly retrospective (by which I mean, though she is remembering and reconstructing her past, her narration does not show her learning or developing from it). In the review you won’t ever read, I compare her unfavorably to Pip in Great Expectations (and why not, since every much-hyped novel these days seems to explicitly invite the comparison). Reading Great Expectations, you realize early on that Pip the character is not (until the end) Pip the narrator. There is great artistry in that palimpsestic effect, as well as real moral significance in his changing perspective. I did not find any comparable achievement by (either) Elena. As a Kunstlerroman, also, which is what the Neapolitan novels could (perhaps should) be, the series is unconvincing, or at least not compelling. Elena talks a lot about her writing, about its deficiencies and changes, and especially about women’s writing and women in writing as creatures of the male imagination and aesthetic. Her chronicle of her life, of Lila’s life, and of their friendship does not strike me as a powerful or empowering alternative: it’s too linear, too literal, and in its own ways, too reductive. If it is (as, say, Aurora Leigh is for Aurora Leigh) the culmination of her artistic development, then for me (despite all its emotional power, and the richness and complexity of its historical and sociological description) it’s underwhelming. (Maybe if I’d been this blunt in my draft review, we would have gotten somewhere!)

Lots of readers disagree with me, and plenty of critics have written at length about what they see as the brilliance of the series. Every major critical outlet (well, except one, I guess) has or will have an opinion on offer and I have yet to see one that isn’t pretty much ecstatic. So you have lots of support if you think I’ve read uncharitably or stupidly. My review, however, would have been mixed, for the reasons I’ve given. I found Nicola Griffith’s Hild a much more exciting literary experience: I’m really looking forward to reading its second volume. I’m keen, too, to read Adam Johnson’s new collection of short fiction, because I thought The Orphan Master’s Son was extraordinary. I will read anything else that Helen DeWitt publishes, because The Last Samurai was brilliant on every level. Having given Ferrante my best shot as a reader and critic, here and elsewhere, though, I think I’m done with her.

I wouldn’t even care — or bother saying anything — about this except that if you want (as I sometimes want, or think I want) to participate in ‘the literary world,’ the books everyone is talking about exert a certain pressure on you. (Recent exhibit A: The Goldfinch.) Sometimes, that’s fine: it’s a good book, it’s a good conversation, it’s a good intellectual exercise. Even when I write what I think is a really good piece of criticism about a current hot title, though (Life After Life, say — and there’s a review I’m proud to have my name on), I often end up feeling a bit disappointed in the process. What (as Dorothea says) could be sadder than so much ardent labor all in vain? Because there’s always another, and another, and another good but probably not great book coming down the pipe that we’ll all feel we have to read and talk about.

The joy of blogging is the total freedom it brings from publishers’ schedules and publicists’ blandishments. I’m sure my current feelings of exasperation will abate, but you can probably expect a lot more Dorothy Dunnett around here for a while, until they do.

On Vacation!

I am in Vancouver enjoying some relaxing and sociable time with family and friends. As seems to be traditional, I have arrived in the middle of a heat wave! Happily, my parents have a lovely shady garden where we can shelter from the sun.


In the meantime, the July issue of Open Letters is live, so head on over for lots of good bookish reading, including my review of Kate Atkinson’s A God in Ruins. After much debate — internal but also with my wise co-editors — I decided to “spoil” the ending of the novel because my reaction to it was so specific I could not see how to have the discussion I wanted about the novel without going into details. So if spoilers are something that bothers you and you haven’t read A God in Ruins but expect to, consider yourself warned.

There’s lots more to read in the new issue, including Anne Fernald on a recently released biography of Virginia Woolf, Steve Donoghue on a new translation of The Tale of Genji, Robert Minto reviewing the reviewers of a new biography of Saul Bellow, and our traditional summer reading feature, with lots of “cool” recommendations from the OLM team.

Enjoy, and Happy Canada Day!

From the Archives: Pondering the ‘Utilitarian’ Humanities

I’ve been thinking about this old post a lot lately because it’s hard to escape the discouraging conclusion that — despite having plenty of data on our side — humanists aren’t doing well at convincing people that a humanities major is a perfectly practical choice. (I’m glad people are doing research on why better evidence against a pet theory actually makes people less likely to change their minds, because the problem seems pretty widespread these days.) And yet arguments for the intrinsic value and broader benefits of such studies, of the sort I gestured to here, also seem to be losing propositions, as if it is either an unaffordable luxury or self-indulgent navel-gazing to seek deep understanding of art, literature, philosophy, history, or any of the aspects of our rich and complex world that the humanities address.

Maybe it’s just media coverage that makes things seem so dire, but politicians (many of whom, of course, have liberal arts degrees themselves) seem relentlessly anti-intellectual these days, and they say and do what gets them votes, so that’s some kind of indicator of general trends. The comments thread on any story about higher education is also bound to be full of people decrying the humanities as a waste of time. And though students actually in our classes more often than not seem to find them plenty interesting and valuable, enrollments are falling.

Have we gone about this the wrong way? What else can we say or do except what we believe to be true about the subjects we study and teach? I really don’t know, but “don’t be a pig” remains a motto I think we should all seek to live by.*


Is Arguing for the Practical Utility of Literary Studies Ultimately Self-Defeating?

There’s a review of Louis Menand’s The Marketplace of Ideas up at Slate:

The Marketplace of Ideas is a diagnostic book, not a prescriptive one, and Menand’s proposals for how we might invigorate the academic production of knowledge are added as afterthoughts. He thinks we ought to shorten the length of study required for graduate students; the fact that it takes three years to get a law degree and close to a decade to get a humanities doctorate, he writes, is just another symptom of professors’ anxiety about the worth of their trade. We also ought to invite more applications from students who might not have self-selected as academic specialists. The notional aims of the academy—the lively and contentious production of new scholarship—would be better served by making academic boundaries more permeable rather than less.

But in the end, Menand’s proposals, smart and coherent though they are, seem less important than the case study provided by his career. He has managed to stay accountable at once to his colleagues in English departments and to his audience of general readers, and he has pulled this off without sacrificing either rigor or range. Menand is proof that an academic can be a great prose stylist, and that a journalist doesn’t have to be a dilettante—and that having a commitment to one community enriches one’s contribution to the other. He makes it hard to take seriously the rhetoric of crisis, and helps us get on with the important business of creating the problems of the future.

Reading it led me to look back at the excerpt from it published in Harvard Magazine last fall. I had a few ideas in response, which I wrote about then. One of my remarks at that time was this, made in the context of the difficulty of defining a coherent curriculum when our discipline has become so undisciplined that there is no principled way to justify doing one thing rather than another, and thus it becomes increasingly hard to justify doing any of the things we do at all:

Too often, I think, we resort to a rhetoric of skills (critical thinking!) that (as Menand points out with his remark about the dubious efficiency of studying Joyce to achieve more general ends) rather strips away the point of working through literature to achieve such general, marketable ends.

I heard similar arguments being made again this week as we worked on setting up a “capstone” course for our honours students: in response to my observation that some proposed ingredients were designed to groom the students for graduate school in English (something about which I am currently filled with anxiety, thanks to the kinds of discussions underway here and here and here and here, not to mention these classics of the scarifying ‘just don’t go’ genre), I was reminded that good research and writing skills, as well as oral presentation skills, would benefit students in “law school or publishing or journalism or really any other jobs.” And don’t forget that we can teach them how to write a cv and a resume, and writing grant applications is not just for SSHRC but something you may have to do in many different contexts.

First of all, I totally agree. Research and writing and oral presentations are all excellent things to be good at, as are synthesizing a range of material and learning to build a strong evidence-based argument and proofreading and making a persuasive case for the value of a project you want other people to pay for and filling out forms and all the other transferable skills we know are part of what our students are learning and practising through their work in our classes.

That said, the more I think about it the more I wonder whether, in playing the game of “we’re useful too” we don’t actually end up rendering ourselves irrelevant by so happily setting aside the specificities of our work. Isn’t literary analysis (not to mention the extensive reading of, you know, literature, that it requires) a fairly roundabout route to those practical goals? If that’s what the students really want from us, we could save them a lot of time by not making them read so much Chaucer or Dickens or Joyce or Rushdie, that’s for sure. If we play the game that way, it seems to me we are bound to lose eventually. Yes, like writing, critical thinking requires content: “writing across the disciplines” makes sense because you need something to write about, and you can’t teach critical thinking unless you have something to think about either. But if you can learn to write anywhere, you can learn to think (and all the rest of it) anywhere too. Why English?

We need a pitch for ourselves that makes literature essential, but not in the self-replicating terms Menand rightly identifies as characteristic of professionalized literary studies (that is, by contributing to our profession according to existing norms and as judged by the profession itself, and the profession alone). We need to justify the study of literature for reasons literature alone can satisfy. We need to stand up, not for our methodology (doing so, after all, has meant warping that methodology to make it look as much as possible like some kind of science, or being so inscrutable that outsiders can’t tell what we’re doing anyway), but for the poems and novels and plays we take with us into the classroom every day. We need to be arguing, not that studying literature is just another way to do the same things every other discipline does (what university major won’t help you with critical thinking, research, writing, and presentation skills?), but that there are things–valuable things–about literature that you just can’t get any other way.

I’m thinking the way there is through aesthetics, on the one hand, and ethics on the other, and that the pitch should somehow involve a commitment to the importance of cultural memory and cultural critique, to character building and self-reflection, and to the needs as well as the ideals of civic society. If that sounds old-fashioned, I guess I don’t mind, though I’m not sure it needs to be.

In his account of utilitarianism, John Stuart Mill famously urges us away from too narrow a notion of the pleasures to be valued under his system:

Now it is an unquestionable fact that those who are equally acquainted with, and equally capable of appreciating and enjoying, both, do give a most marked preference to the manner of existence which employs their higher faculties. Few human creatures would consent to be changed into any of the lower animals, for a promise of the fullest allowance of a beast’s pleasures; no intelligent human being would consent to be a fool, no instructed person would be an ignoramus, no person of feeling and conscience would be selfish and base, even though they should be persuaded that the fool, the dunce, or the rascal is better satisfied with his lot than they are with theirs. They would not resign what they possess more than he for the most complete satisfaction of all the desires which they have in common with him. . . .

Whoever supposes that this preference takes place at a sacrifice of happiness — that the superior being, in anything like equal circumstances, is not happier than the inferior — confounds the two very different ideas, of happiness, and content. It is indisputable that the being whose capacities of enjoyment are low, has the greatest chance of having them fully satisfied; and a highly endowed being will always feel that any happiness which he can look for, as the world is constituted, is imperfect. But he can learn to bear its imperfections, if they are at all bearable; and they will not make him envy the being who is indeed unconscious of the imperfections, but only because he feels not at all the good which those imperfections qualify. It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question. The other party to the comparison knows both sides.

We should similarly urge our administrators away from too narrow an idea of the useful. Our motto could be, “Don’t be a pig.”

*All due respect to pigs, of course, whom we now know to be among the smartest (and cleanest) of our animal friends!

[Originally published January 20, 2010. In a follow-up post, I suggested that “The ‘Skills Argument’ Sounds Even Worse When We’re Talking About PhD’s in the Humanities.” That’s another set of concerns I still puzzle over a lot, as seen also in my 2011 post on “The PhD Conundrum.”]