Friday, June 23, 2006

Sometimes I Think I Am the Last Classical Humanist Left in America, if not the World.

I admit it. I want to rewind the clock to 1850, when there was some kind of consensus in universities about what the “Great Books” were in each discipline. I want to sit in dusty libraries in Oxford, studying the Roman poets, Milton and Shakespeare. I want a high, leather-backed chair, a pipe, a tweed coat and mahogany furniture. Things are too confusing nowadays…

I want people in academia and society at large to come to some kind of agreement that we are all, in some sense, working together to make sense of the world. I want people to stop calling each other racists and sexists, to stop promoting the idea that nobody can understand or relate to anybody else’s experience—the idea that there is no such thing as common human experience (Love? Death? Youth? Aging? Fear?). And I want them to stop dismissing the past as something wrongheaded, dangerous and best forgotten.

I admit, too, that I believe some books are better than others because they manage to encapsulate experiences and feelings that most people can relate to—even though these are rarely the most popular books. Most people don’t read much, and therefore lack the skills necessary to make sense of complicated sentences; or they are so preoccupied with their own lives that they cannot connect with the experiences of fictional characters whose lives, on the surface, seem different from their own.

The charge of “elitism” that has been leveled again and again at the traditional canons of literature is both valid and not so. In Maoist China, the People’s Army threw doctors out of hospitals and replaced them with ordinary, medically untrained citizens on the grounds that specialization was “elitism,” and that everybody is “good enough” to be a doctor. These new “doctors” performed operations: patients bled to death or died of infections.

It is certainly true that past canons of Western literature seriously underrepresented just about everybody except Anglo/Christian and Helleno/Pagan men. It is true also that they tended to favor certain forms and themes to the total exclusion of others. This has been changing, albeit slowly, and must continue to change. But it is not necessary to burn something to the ground in order to improve upon it. Shakespeare may be a “dead white man,” but his plays are only as dead as we are deadened to their language and meaning. Have we produced more powerful expressions of the emotions connected with young love than Romeo and Juliet? Have we managed to capture the alienation of old age better than King Lear does? Falstaff, Richard III, Hamlet—has anybody come up with more complex and living characters than these? Maybe. But cultural artifacts aren’t like computers—obsolete after four years and replaceable. The best ones hold their charge, because they say something we may once have known, but have forgotten, or have never heard said in quite that way before…

I grew up without much of a sense of history. I dropped out of AP World History because the book was just too dense. All I remember from High School Roman History is Caligula's "minnows." My seventh grade American History class, which was taught by the football coach, was a disaster: I spent the whole year occupied with trying (and failing) to memorize details like whether the “Shot Heard Around the World” was fired in Lexington or in Concord.

In fact, most of my reading over the years has been limited to fiction. Much of what I know about history comes from novels: I learned some of the history of Russia from Tolstoy, Gogol and Dostoevsky. From Dickens I learned a little about Victorian England. Faulkner, O’Connor, and McCullers showed me the American South, pre- and postwar. Most recently, I’ve been seeing the Old West through Cormac McCarthy’s eyes. Getting your history this way has its obvious disadvantages, in that some of the work of interpreting the past has been done for you, but it has taught me how people, wherever and whenever they live, shape and are shaped by external circumstances.

Increasingly, I’m becoming convinced that studying actual history is essential, too, if we don’t want to be blown around aimlessly forever in the winds of political and social fashion.

Even if the formal discipline of history has become the same kind of theoretical quagmire as everything else, it at least has the advantage of being in the past. Time has passed, people have thought and written about what any given historical event meant to the people who lived through it, and what new meanings it may have acquired over time. Historians may debate the causes leading up to the dropping of the A-bombs on Japan, but there is a finite and immutable sequence of events and pile of evidence there for us to interpret.

I understand that things get more complicated the more closely you look at them, and that just about everything has multiple causes and meanings. But causes and meanings are not infinite—they cannot be, because they are limited by the factors that give any thing its thing-ness. “Ulysses” will never contain one word more or less than it now contains. The trap into which I fear we have blithely pranced during my lifetime (though the foot was lifted maybe 10 years before I was born) is one in which, because everything is possible, nothing is possible. Because everything is equally valid, nothing has validity. People call each other names—speak from within mutually unintelligible frameworks.

This has recently manifested in the form of attacks on the media by both “Liberals” and “Conservatives.” Three times in the past two weeks I have heard somebody say that the New York Times is “the liberal media.” The Republican government has (unsuccessfully, thank God) tried this year to shape the content of National Public Radio’s programming to include more “conservative” messages, so that each program would be “balanced.” And every good liberal knows that Fox News can’t be trusted because it is run by Rupert Murdoch, an arch-conservative.

The bottom line of all this is the total erosion of the idea of “objective” reality. It is one thing to problematize objectivity—it is quite another to utterly annihilate it. The “trustworthy” source of information, in this climate, becomes the one that reinforces whatever subjective beliefs you already hold. Those who deny that the Holocaust ever happened in Germany have websites supporting and reinforcing their “position.” Those who, in the face of incontrovertible, carbon-dated evidence of the biological development of species, continue to insist that Adam and Eve populated the Earth 4000 years ago, are publicly defended by our president and by powerful lobbyists, who feel that “all sides of the issue” deserve a voice. There is no “issue.” Carbon-dating is not a matter of faith. By being “fair” to Creationism, you render science meaningless: simply another “choice” among many.

If the Great Libraries were to burn to the ground tomorrow, would we be hard-pressed to decide between saving “The Canterbury Tales” or Stephen King’s “Pet Sematary”? Some might answer: “Yes. Absolutely. There is no empirical qualitative difference. The judgment that ‘The Canterbury Tales’ is superior to ‘Pet Sematary’ is the result of class-based aesthetic and moral tendencies that are relevant only to a small portion of the population.” But I would maintain that humanity would profit more by the preservation of “The Canterbury Tales” than by that of the Stephen King book, simply because the former is more densely packed with human nature and experience. “Pet Sematary” is scary and fun, but “The Canterbury Tales,” if all else were lost, could remind us of who we once were and re-teach us how to communicate.

I’m talking about criteria, both aesthetic and philosophical, for what constitutes badness, okayness, goodness and greatness in, well, in anything, but for now let’s stick with literature. I don’t pretend to believe that I am in a position to establish these criteria; they must represent the consensus of as many readers and writers as possible, and they must be flexible, so as to accommodate future development—so as not to become a kind of “Académie Française” of snooty culture police.

“Consensus” is the key word here: If we could begin with some kind of agreement that everything is not completely relative, the next logical step would be to try to outline what we do know. Then, at least, we could have a conversation...
