Alas, alas, what misery to be wise when wisdom profits nothing!
Great books do not reveal themselves all at once. Old classics must be revisited from time to time, at different stages of life, in order to experience the many resonant frequencies of the work. This time around I chose to listen to these Theban plays as an audiobook, with a full cast; and it was far preferable to the mute page.
Reading, listening to, or watching the Greek plays may be the nearest we get to time travel. The works immerse us in a foreign world. What struck me most was the Greek attitude towards freedom and fate. Shakespearean tragedy relies on human choice. As A.C. Bradley notes, the tragedy is always specific to the individual, to the extent that the tragedy of one play would be impossible for the protagonist of another. Put Hamlet in Othello’s place, or vice versa, and he would make short work of the play’s problem. The tragedy in a Greek play is, by contrast, inevitable and universal. By the time the curtain rises in Oedipus Rex, Oedipus has long ago sealed his doom.
There is nothing special about Oedipus that marks him for a tragic fate. His tragedy could have befallen a Hamlet or an Othello just as readily as an Oedipus. This changes the entire emotional atmosphere. Whereas in a Shakespearean tragedy we feel a certain amount of dramatic tension as the protagonists attempt to avert crisis, in Greek tragedy there is instead a feeling of being swept along by an invisible, inexorable force—divine and mysterious. It is animated by a far more pessimistic philosophy: that honest, noble, and wise people who do nothing wrong can be dragged into the pit of misery by an inscrutable destiny.
As a result, the plays can sometimes engender a feeling of mystery or even of vague mysticism, as we consider ourselves to be the mere playthings of forces beyond all control and understanding. Characters rise to power in such a way that we credit their virtues for their success; and yet their precipitate fall shows that there are other forces at play. Life can certainly feel this way at times, as we are buffeted about, lifted up, and cast down in a way that seems little connected to our own actions. For this reason, I think that the fatalistic pessimism of these plays is both moving and, at times, even consoling.
Of the three, the most artistically perfect is Oedipus Rex, which Sophocles wrote at the height of his career. Antigone, the last play in the story’s chronology, was actually written first; and Oedipus at Colonus was written over thirty years later, at the very end of Sophocles’ life.
Though arguably the worst of the three, Antigone is the most thematically interesting. It pits two ethical concepts against one another with intense force, specifically different sorts of loyalty. Is it better to be loyal to one’s family, to the gods, to the state, or to the ruler? Creon’s interdiction, though vengeful and petty, is understandable when one remembers that Polynices is a traitor responsible for an attack on his homeland that doubtless cost many citizens’ lives. Creon could have justified his decree as a discouragement of future disloyalty. Antigone believes that duty to family transcends the duty of a citizen, and the events justify this belief.
It must be admitted, however, that this ethical question is muddled by the religious nature of the central issue. Few people nowadays can believe that burial rites are important enough to merit self-sacrifice and civil disobedience. When the superstitious element is removed, Antigone’s ethical superiority seems questionable at best. Certainly there are many cases when loyalty to the family can be distinctly unethical. If a sister sheltered a brother who had just escaped imprisonment for murder, I think this would be an unequivocally immoral act. But since burial does not involve help or harm to anyone, the ethical question becomes largely symbolic—if no less interesting.
Even if the emotional import of these plays has been somewhat dulled by the passing years, they remain amazingly alive and direct. Such is their power that, even now, when the Greek gods have passed into harmless myth, we can still feel the sense of awe and terror in the face of a divine order that passes beyond understanding. It would take a long time for theater to again reach such heights.
It really is a nice theory. The only defect I think it has is probably common to all philosophical theories. It’s wrong. You may suspect me of proposing another theory in its place; but I hope not, because I’m sure it’s wrong too if it is a theory.
Like many other works of philosophy (and those of other subjects, for that matter), Naming and Necessity will likely be perplexing if you do not know what the author is arguing against. At the time that Kripke gave these lectures, the dominant theory in the philosophy of language was the Frege-Russell theory of reference. It is a rather elegant and simple theory, and you can look up Russell’s famous paper, “On Denoting,” or Quine’s “On What There Is,” online if you would like to know more about it. But I will explain it briefly.
Essentially, the idea is that names are shorthand descriptions. Thus, if you say “there’s a tiger over there!” you’re really saying something like “there is an x over there, such that x is feline, yellow-brown, black striped, quadrupedal, solitary, bigger than a human,” and so on. This way of analyzing names was, I believe, partly adopted because it carried no ontological commitment. It avoids confusing situations, like when you have to say “wizards don’t exist!”—for how could you name the things (wizards) that do not exist? That is paradoxical. On the Frege-Russell view, this awkwardness is avoided, since, when you assert that wizards do not exist, you are really saying “there is no x such that x is humanoid, magical, bearded, robed,” and so on. Thus, by specifying the criteria, lots of annoying existential questions can be side-stepped.
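The Russellian paraphrase can be made explicit in the notation of predicate logic. Here is a rough sketch (the predicate letters are my own shorthand, not Russell’s):

```latex
% ``There's a tiger over there'' on the descriptive view:
\exists x\,\bigl(\mathrm{Feline}(x) \land \mathrm{YellowBrown}(x) \land
  \mathrm{Striped}(x) \land \mathrm{OverThere}(x)\bigr)

% ``Wizards don't exist'' -- no paradoxical naming of wizards, just a denial:
\neg\exists x\,\bigl(\mathrm{Humanoid}(x) \land \mathrm{Magical}(x) \land
  \mathrm{Bearded}(x) \land \mathrm{Robed}(x)\bigr)
```

The second formula shows how the theory dissolves the paradox: nothing is named, so nothing nonexistent needs to be referred to.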
Nevertheless, I think that most people, when they first learn of this theory, feel a bit uncomfortable with it. The theory just is not intuitive. I do not think that anything analogous to Russell’s analyses is going on in my head when I hear “there’s a tiger over there!” In other words, I do not think of tigers as bundles of qualities or clusters of descriptions; rather, the relationship of the name “tiger” to the living, breathing animals seems much more straightforward. Kripke is essentially arguing that our intuition is correct. In fact, it is Kripke’s express point to uphold our intuitions regarding names:
Of course, some philosophers think that something’s having intuitive content is very inconclusive evidence in favor of it. I think it is very heavy evidence in favor of anything, myself. I really don’t know, in a way, what more conclusive evidence one can have about anything, ultimately speaking.
Seeing as Kripke is not fond of theories (as the opening quote shows) and is quite fond of intuition, this puts him into a bit of a pickle, for how is he supposed to argue against the theory? Thus, most of Kripke’s arguments rely on bizarre counterfactuals, which he expresses using the language of “possible worlds.” (I understood this as merely a way of speaking about hypothetical or counterfactual statements, rather than any metaphysical doctrine about possibility and parallel worlds; and this way of speaking, when understood as a figure of speech, does convey the essential point rather well.)
To explain Kripke’s argument, let me come up with a bizarre counterfactual of my own. Suppose that someone (presumably with far too much time and money on their hands, and with a questionable sensitivity to animal rights) decided to take some lions from Africa and introduce them into Asia. Then, suppose this person decided to shave the lions’ manes, to paint them yellow-brown, and then to paint black stripes on them, so as to look just like tigers. Suppose he is even such a genius animal trainer that he trains these lions to behave indistinguishably from tigers.
Now we return to the above example. If “there’s a tiger over there!” really meant “there is an x over there, such that x is feline, yellow-brown, black striped, quadrupedal, solitary, bigger than a human,” then the statement would be perfectly true, even if the person were pointing to the painted lions.
But it is not true. Lions and tigers are what could be called ‘natural types’; and natural types are distinguished by some essential quality, not by their total descriptions. Kripke is really reviving the old notion of essentialism: names pick out the object that possesses the essential property associated with that name. In the case of lions and tigers, I suppose the essential quality would be their genotypes. Thus, the essential property of a type of thing need not be the qualities by which we normally identify the thing. We normally identify lions and tigers by the way they look and act, but the above example shows that even those qualities are contingent; it is their respective essences (their genotypes in this case) which are the necessary qualities of tigers and lions.
This leads Kripke to disagree with another ingrained philosophical idea (the second N of the title): that ‘necessary’ and ‘a priori’ are synonyms. It was thought that only necessary truths could be known a priori, and only a priori truths were necessary. (In other words, you could only be certain about things you knew independently of experience.) Thus, “all bachelors are unmarried” is, in this view, a necessary truth, even if there are no bachelors at all, simply because that is the definition of ‘bachelor’; it is an analytic statement, true by definition, a mere tautology, and thus can be known a priori. This restriction of necessary statements to trivial tautologies was, I think, a way of fighting against obscure metaphysical arguments, such as the ontological argument for the existence of God.
Kripke, as I said, disagrees with this line of thinking. For Kripke, things can be known a priori that are not necessary, and things can be necessary and learned empirically (or a posteriori). The genotypes of lions and tigers are a case in point: it took a long time to discover DNA, and to create the tools needed to investigate it in depth. DNA was, in other words, obviously learned of empirically. Nevertheless, it is a necessary truth that lions have the lion essence (genotype), and tigers have the tiger essence (genotype)—because if they did not, they would not be lions and tigers. Necessary truths, then, need not be known a priori. (In other words, you can be certain about some things you learn from experience.)
The reverse distinction can also be made. If I pick up a certain stick, and say “I shall use this as the standard for my new measure, the schmeter,” I can know a priori that whatever length the stick is (in, say, inches or meters), it is exactly one schmeter. However, the exact length of a schmeter is contingent on the stick, and we can imagine situations in which the stick was longer or shorter, so the exact meaning of this a priori knowledge is contingent on some state of affairs. To sum up Kripke’s distinction: ‘necessary’ is a metaphysical term having to do with the essence of something, while ‘a priori’ is an epistemological term having to do with how we come to know something.
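Kripke’s two distinctions can be visualized as a grid. The examples are drawn from the discussion above; their placement is my own summary, not Kripke’s:

```latex
\begin{tabular}{l|l|l}
                    & \textbf{a priori}                    & \textbf{a posteriori} \\ \hline
\textbf{necessary}  & ``all bachelors are unmarried''      & ``tigers have the tiger genotype'' \\
\textbf{contingent} & ``this stick is one schmeter long''  & ``there is a tiger over there'' \\
\end{tabular}
```

The traditional view allowed only the diagonal (necessary a priori, contingent a posteriori); Kripke’s point is that the other two cells are occupied as well.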
As I hope you can see from my summary, Kripke’s arguments are meant to be intuitive; he rejects certain philosophical ideas by just pointing to situations in which they fail to properly apply. This, I think, is why Naming and Necessity is so well known: one need not master some technical apparatus, but merely think through the consequences of some hypothetical scenarios. Certainly, this is not a perfect book. Kripke is wordy and repetitive; this already short book could probably have been much shorter and crisper, or could have at least covered more territory. Still, Kripke was arguing against a whole paradigm; and paradigms do not go gentle into that good night.
When I finished this book, I was fairly convinced; but as subsequent conversations (in Wastrel’s comments, for example) have shown me, there are some awfully strong counter-arguments. Philosophical questions are never so easily resolved. In particular, I am curious to see how Kripke proposes to deal with some of the situations which motivated the creation of the descriptive theory of names in the first place—for example, statements like “wizards aren’t real.” How can there be a causal connection with something that does not exist? And how can the name refer to a natural type of a fictitious object? After all, facts are easy to talk about; fiction is another thing entirely.
This book is difficult for me to review, mainly because there were so many parts of it that I did not fully understand. Quine is not writing for the general reader; he is writing for professional philosophers—a category that excludes people such as myself, who have not taken a single course in formal logic. Nevertheless, there are some parts of this book—particularly the first two essays, “On What There Is” and “Two Dogmas of Empiricism”—which can be understood by the persistent amateur.
I will try to explain what I think I know about Quine, subject to the very important caveat that these are the general impressions of somebody who is not an expert. I might easily be wrong.
Quine is an American, and so is very literal; he likes things he can touch, or at least can clearly define. This leads him to a kind of ontological puritanism: he wishes to admit as few types of entities into existence as possible. The most obvious token of this is his materialism. Quine thinks the world is fundamentally matter; thus, he rejects the existence of spirits, and, more surprisingly, of minds—at least minds as distinctly different metaphysical objects. (He is fine with keeping mentalistic terminology, so long as it is understood as paraphrases of behavioral phenomena.) This also prompts Quine to reject other, more banal, sorts of things like meanings and properties. In fact, Quine only acknowledges the existence of two sorts of things: physical objects, and sets (or classes). If I am not mistaken, Quine’s belief in something so abstract as a logical set is motivated by his famous indispensability argument—that we ought to believe in the types of things our theories of the world need.
Quine’s materialism is tied to two other -isms: holism and naturalism. By naturalism, I mean that Quine thinks that our knowledge comes from observation, from experience, from science; furthermore, that this is the only type of knowledge we have available. Quine would never attempt something like Descartes did, seeking to ground all of the contingent assertions of science in an unquestionable first principle (in Descartes’ case, this being that he thinks, and therefore is). Quine is even uncomfortable with doctrines such as Wittgenstein’s, which hold philosophy to be a sort of second-level activity, a discipline which tackles questions of a fundamentally different sort than those investigated by scientists. For Quine, there are no fundamentally different sorts of questions: all questions are questions about the natural world, and thus on identical epistemological and ontological footing. The only difference between philosophy and science, for Quine, is that philosophers ask more general questions.
Quine’s holism is, perhaps, the most interesting aspect of his views. The logical positivists thought that individual statements could be accepted or rejected based on our experiences. In other words, we make a statement about the physical world, and then go about trying to verify it with some experience. But Quine points out that this is far too simple an account. Our statements do not exist in isolation, but are tied to an entire web of beliefs—some very abstract and remote from any experience.
Keep this in your mind’s eye: a huge, floating hunk of miscellaneous trash, adrift in the ocean. Now, only some of this trash directly touches the ocean; these are the parts of our knowledge that directly ‘touch’ the experiential world. A great part of this trash, however, lies in the center of the mass, far away from the water; and this is analogous to our most abstract beliefs. If this gigantic trash island were to hit something—let us say, a big boat—two things could happen. The boat could be destroyed, and its wreckage simply added onto the floating trash island; or, the boat could tear its way through the trash island, changing its shape dramatically. These are, roughly, the two things that can happen when we face a novel experience: we can somehow assimilate it into our old beliefs, or we can reconfigure our whole web of beliefs to accommodate this new information.
I will drop the metaphor. What Quine is saying is that there are no beliefs of ours that cannot be revised—nothing is sacred. We have even considered revising our principles of logic, previously so unquestionable, in the face of quantum weirdness. There are also no experiences that could not, in principle, be explained away: we could cite hallucinations or mental illness or human error as the reason behind the anomalous experience.
Keeping Quine’s naturalism and holism in mind, it is pretty clear why he rejects the main tenets of logical positivism. First, Quine points out the vagueness of what philosophers mean when they talk about ‘analytic statements’. The classic case of an analytic statement is “all bachelors are unmarried,” which is true by definition: since a bachelor is defined as an unmarried man, it could not be otherwise that bachelors are unmarried. But note that this relies on the idea that ‘bachelor’ has the same ‘meaning’ as the phrase ‘unmarried man’. But what is a ‘meaning’? It sounds like a mental phenomenon; and because Quine does not hold minds to exist, he is very skeptical about ‘meanings’. So in what sense do ‘meanings’ exist? Can they be paraphrased into behavioral terminology? Quine does not exactly rule it out, but is rather dubious.
Quine’s holism is also at odds with the project of logical positivism. For, as already noted, logical positivists regard the meaning of a statement as its method of verification; but Quine believes—and I think quite rightly—that statements do not exist in isolation, but rely on a whole background web of beliefs and doctrines. Here is a concrete example. Let us say we wanted to go out and verify the statement ‘flying saucers are real.’ We wander around with our camera, and then suddenly see a shiny disk floating through the air. We snap some photos, and pronounce our statement ‘verified’. But will people believe us? Scientists look at the object, and say that it is a weather balloon; psychologists examine us, and say that we are demented. The statement has thus not been verified at all by our experience; and even if we had better evidence of flying saucers than a few photographs, it is at least conceivable that we could go on finding alternative explanations—secret government aircraft, some mad scientist’s invention, an elaborate prank, etc.
I will stop trying to summarize his arguments here, because I feel like I am already in over my head. I will say, however, that Quine’s argument against logical positivism seems to rely on his own presumptions about knowledge and the world—which may, after all, be quite reasonable, but this still does not make for a conclusive argument. In short, Quine may be arguing against the dogmas of logical empiricism with dogmas of his own. I often had this experience while reading Quine: at first I would disagree; but then, after formulating my disagreement, I would realize I was only begging the question, and that we were starting with very different assumptions.
Quine is preoccupied with this idea of ontological commitment. He is exercised by the felt necessity of postulating the existence of things used in discourse, like meanings, mathematical objects, and so forth. These are, no doubt, important questions; yet I do not find them terribly interesting to think about. In my experience, wondering about whether something ‘really exists’ often leads up dark intellectual alleys. When it comes to things like UFOs, the question is doubtless a vital one to ask; but when it comes to things like ‘sets’ and ‘meanings’, it does not excite me: for what would be the difference if sets ‘really existed’ or if they were just tools used in discourse with no existence outside of names and thought? I will leave these desert landscapes of logic for ones more verdant.
To conclude, Quine was obviously a brilliant man; he was, in fact, so brilliant, that I cannot understand how brilliant he was.
When I opened the pages of this book, I knew little about Alexander Hamilton aside from the fact that he wrote most of the Federalist Papers. But that man had a life indeed. I immediately found myself transfixed by a story that seemed more suited to fiction than to fact. No wonder that Hamilton’s life has been made the subject of a musical. (Unfortunately, from what I have heard of the music, it is not to my taste.)
Ron Chernow must have known that he had struck gold once he began research for this book. Hamilton’s story has all of the elements of a good Victorian novel: a poor and unfortunate upbringing (born out of wedlock, and orphaned young); a good deal of bloodshed; an even greater dose of scheming and argumentation; a tender love story and a sordid affair; and, to cap it off, an arch-rival who brings about a tragic end. Piloting through this maelstrom of adventure is the redoubtable Alexander Hamilton: clear-eyed, bright, industrious, at times imprudent and hasty, and always true to his own nature.
In short, I found this biography both extremely readable and a revealing portrait of the first years of this nation. Chernow is a flexible writer, capable of handling the pathos of melodrama, the intrigue of political scandal, the excitement of intellectual innovation, the frenzy of war, and the private moments of quiet intimacy. His primary strength is arguably his psychological insight. Unlike Robert Caro, who is a historian as much as a biographer, Chernow focuses on the inner workings of his subject, letting us see history through the man’s eyes rather than the man ensconced in history.
Nevertheless, I do think that Chernow’s focus on psychology can lead him astray. At his worst, he is prone to a kind of cheap psychoanalyzing that I think adds very little to the substance of this book. This is most in evidence in Chernow’s handling of Hamilton’s childhood on St. Croix. Chernow is quick to invoke this experience whenever he wishes to explain Hamilton’s behavior. This is understandable, since it is arguably a biographer’s duty to make sense of their subject’s personality by tracing their experience; and Hamilton’s childhood was unique. However, the logic of psychoanalysis is so flexible as to be able to produce any conclusion you wish to wring from it.
Here is an example. We learn that Hamilton’s mother was accused of being a prostitute, and had children out of wedlock. She was abandoned by Hamilton’s father, cast out from polite society, and then died while Hamilton was quite young—penniless and alone. Now let us imagine that Hamilton, in adulthood, was scrupulously faithful to his wife and had a family life entirely free of scandal. The biographer could then say it was an intense desire to escape this childhood experience. Now let us imagine that Hamilton was a rake and constantly had affairs. The biographer could then say that he had a special sympathy for women on the outskirts of society. And so on. My point is that any subsequent behavior can be viewed as either a result of, or a reaction against, this childhood experience, which makes its use as an explanation extremely dubious.
This is my first critique of this book. My second is Chernow’s tendency to lionize his subject. It would be unfair to accuse him of writing a hagiography. Chernow is by no means blind to Hamilton’s faults. Still, one senses that Chernow for the most part puts a forgiving and generous interpretation on Hamilton’s actions, while casting the behavior of Hamilton’s foes—Adams and Jefferson, notably—in a far less tolerant light. As Chernow did in his biography of John D. Rockefeller, he is more eager to refute allegations against his subjects than to confirm them. In the hands of another biographer, I think that Hamilton could have come across as a less glorious figure.
In any case, Chernow has produced a well-researched biography that is both exhilarating and enlightening. It is a thoroughly fine book.
I think I understand what military fame is: to be killed on the field of battle and have your name misspelled in the newspapers.
—William Tecumseh Sherman
This documentary was long overdue. Aside from the basic overview, my knowledge of the American Civil War was embarrassingly sketchy; and I had also never seen anything by Ken Burns. Virtually everyone I know who has seen this documentary speaks about it in reverential tones. It lives up to the reputation. The eleven hours are packed with maps, dates, quotes, and most of all—stories. This is a history that focuses on individuals.
A documentary about a war that happened a century and a half ago, beyond all living memory, could easily have become dry and distant. But Ken Burns and his team overcome this obstacle through the dual use of photographs and quotes. The Ken Burns Effect has already entered common parlance, and you can see it displayed to great effect with these old photographs: the slow pan and zoom recreating, somewhat, the feel of watching a film. When these images are combined with the words of the men and women involved—soldiers, statesmen, generals, diarists—brought to life by voice actors, the viewer is drawn into a bewitchingly immersive experience.
The war becomes, not merely troop movements on the screen, but an enormous catastrophe that our protagonists must live through. This gives the series an emotional force rare in documentaries. The horrors of war are the same as ever: seeing comrades fall, leaving children and widows behind, disease, malnutrition, homesickness, ghastly wounds, and the ever-present drudgery punctuated by moments of extreme terror. Some of the most disturbing images are of Yankee prisoners-of-war, totally emaciated through lack of food. Combined with this are the horrors of slavery, so central to the conflict, and the upheaval of the lives of so many civilians.
Virtually everything is well-done. McCullough brings both seriousness and sadness to the narration. The voice actors are uniformly convincing and effective. The music, too, goes a long way in recreating the mood and atmosphere of the times. Most of the guests were, however, rather unremarkable, with the notable exception of Shelby Foote, who was an endless trove of amusing and touching anecdotes. I can see how the documentary catapulted him to fame.
The series is not above criticism, however. Burns focuses most of his attention on the battlefield. This has the double benefit of being exciting and of avoiding the war’s most controversial issues. But I think the series should have delved far deeper into the causes of the war. I would also have appreciated far more about civilian life during wartime, rather than hearing mainly from soldiers and generals. Even Abraham Lincoln, though he makes his due appearances, is given far less space than a private in the Union Army. Such a wider scope would have made the documentary longer, more controversial, and perhaps duller on its surface; but as it stands, the war’s immense political and historical significance is difficult to fathom from the documentary alone.
We are left with a rosy picture of elderly veterans embracing at Gettysburg, as if the war were a bad dream or even a glorious affair. Indeed, our species has been struggling to reconcile the heroic and the barbaric aspects of war since Homer wrote The Iliad. And it seems we still have not been able to face the horrors without including some shades of the bravery, the camaraderie, the brilliant strategy, to brighten up the picture. But the truth is that every war is a moral collapse, and this one was compounded by the taint of slavery. It is an extremely depressing picture, which may get somewhat obscured by the folksiness of this documentary.
This is one of those long overdue books that I should have read as a teenager. Reading it now was a curiously detached experience. I could not find what made the book such a universal classic, and put it down in much the same mood as I picked it up. This is not what a novel is supposed to do.
Golding’s book is a parable for the savagery lurking in the breast of humankind. It shows how a group of English children, when stripped of their usual environment, revert to barbarism and cruelty; and this is supposed to show how our civilization rests on a precarious foundation that may crumble at any moment.
Unfortunately, I found that Golding’s parable did not quite illustrate his intended moral. If a group of young children were stranded on an island for months on end, and finally found still alive with only three or four casualties, it would be considered a miracle and not a failure of civilization. Indeed, I found myself more often amazed at the children’s resilience and organization than at their failure to work together and resist their more primitive impulses.
Perhaps I was relatively unaffected by the violence because, like Golding, I have worked as a teacher. Quite apart from the angelic image of innocent childhood common in media, any teacher knows that children can be remarkably mean-spirited and even cruel—not to mention the other vices. I bet that Golding had a similar experience, and this must have been a major influence on the book. The only people I can imagine being shocked by this book are those who cling to an unrealistically rosy picture of our nature—in other words, people who have never worked as teachers.
All jokes aside, any contemporary reader will note the socially questionable assumptions underlying Golding’s portrayal of the boys’ descent into barbarism. For Golding, painting one’s face and chanting is savage; building shelters and fires is civilized. Yet to my mind, the “hunters” of this book displayed capabilities just as crucial to civilization as those of the supposedly civilized children. Rather than coming across as a sane man in a madhouse, Ralph seemed to be a rather ineffectual leader with no grasp of the importance of ritual, recreational, and aesthetic activities within a society.
So much for the philosophy. As a novel, I also thought that the book was lacking. None of the characters is finely drawn; they emerge from and then lapse into a kind of generalized boyhood. The dialogue is choppy and unconvincing. Golding’s writing is on sturdier ground in narration and description, though even there I found his prose a bit stiff.
To sum up, I cannot see why this book has become such a permanent fixture in our popular reading lists.
Reading this book was an illustration of the dangers of watching the movie first. I could not get Jack Nicholson out of my head, and heard all of the dialogue in his voice. This is, in part, a testament to the quality of the movie, which I think in many ways improved upon the book—both in plot and characterization. And though of course Kesey deserves credit for dreaming this whole thing up, I found his own version to be less compelling.
This is not an insult, however, since the movie is masterful and the book is almost as good. Both McMurphy and Nurse Ratched are iconic characters, and their clash is wonderfully realized. The list of strong secondary characters is too long to go through. As for plot, Kesey has managed to create a perfect parable for the countercultural narrative: that society cruelly forces people into conformity, and rebellious laughter and rule-breaking is the only way to stay sane and human.
All this being said, this is not simply a story about society in general. Now, I must preface these remarks by saying that I generally do not focus on issues of representation in novels. Not that representation is unimportant, but I think that literary merit is independent of social enlightenment. However, the racism and misogyny in this book are so overt and so consistent that they cannot be passed over in silence. Indeed, I think that the issue of gender specifically was so strongly emphasized that it must have been an intentional choice on Kesey’s part, not an incidental attitude of an author from another time.
In short, all of the heroes of this book (aside from the narrator) are white men, and they are oppressed—in a bizarre mirror of real life—by black people and women. The narrator, Chief Bromden, fixates on the orderlies’ blackness, mentioning it at every point. They are the “black boys” with hands “big and black as a swamp” and faces of “slate.” They are rarely referred to by their names and never seen as fully human: just stupid soldiers for the hospital.
But I think that the misogyny runs deeper than the racism and is, indeed, one of the novel’s main themes. Kesey emphasizes it again and again. One of the most famous quotes from the book is: “Man, you lose your laugh you lose your footing.” But what is usually left out is what follows: “A man go around lettin’ a woman whup him down till he can’t laugh any more, and he loses one of the biggest edges he’s got on his side.”
Not to be too Freudian, but Nurse Ratched is the embodiment of the castration complex: a joyless, sexless woman intent on castrating the men. The idea of growing balls and having your balls taken away is repeatedly mentioned. In fact, one of the patients in the “disturbed” ward kills himself by cutting off his own testicles. When Nurse Ratched threatens to have McMurphy lobotomized, he jokes that she wants to cut off his nuts. And so on.
Nurse Ratched’s carefully concealed breasts are also one of the novel’s main metaphors: her attempt to be completely sexless is equivalent to her attempt to control the men and make them weak. McMurphy’s definitive revenge comes when he strips off the nurse’s uniform, exposing her breasts. She thus loses her power because “she could no longer conceal the fact that she was a woman.”
The drama of Billy Bibbit also falls into this pattern, and seems to indicate that, for Kesey, the proper relationship of men and women is for women to sleep with men, and that’s that. Bibbit is momentarily cured by finally getting laid (and the prostitutes are the only women portrayed positively) and is driven to desperation by the idea that his mother—another old, sexless woman—might find out. The entire reason that our hero, McMurphy, is committed in the first place is for statutory rape—a fact seen as heroic, not depraved.
Now, to repeat myself, this misogyny is so constant and so explicit that I do not think it is incidental to the book’s message. As Harding, the most articulate character, says: “We are victims of a matriarchy here, my friend, and the doctor is just as helpless against it as we are.” The whole story, then, becomes a kind of metaphor for the struggle of men to resist the enfeebling force of women; and social conformity itself is seen as primarily the doing of womankind.
I am unsure what to think about this. It is just possible that Kesey intended this as a kind of satire on misogyny, though the text did not read that way to me. In any case, despite this rather glaring theme, I still thought that the book was compelling. Kesey did, indeed, raise awareness of how psychiatric patients are mistreated. And the novel is undoubtedly a classic of the counterculture movement. The movie wisely toned down this prominent misogynistic aspect, which is yet another reason why I think it is superior.
Every valuing, even where it values positively, is a subjectivizing. It does not let things: be.
A Gentle Warning
In matters philosophical, it is wise to be skeptical of interpretations. An interpretation can be reasonable or unreasonable, interesting or uninteresting, compelling or uncompelling; but an interpretation, by its very nature, can never be false or true. Thus, we must be very careful when relying on secondary literature; for what is secondary literature but a collection of interpretations? Personally, I don’t like anybody to come between me and a philosopher. When a philosopher’s views are being explained to me, I feel as if I’m on the wrong end of a long game of telephone. Even if an interpreter is excellent—quoting extensively and making qualified assertions—his interpretation is, like all interpretations, an argument from authority; to interpret a text is to assert that one is an authority on the text, and thus should be believed.
Over generations, these interpretations can harden into dogmas; we are taught the “received interpretation” of a philosopher, and not the philosopher himself. This is dangerous; for what makes a classic book classic is that it can be read repeatedly—not just in one lifetime, but down the centuries—while continuing to yield new and interesting interpretations. In other words, a philosophical classic is a book that can be validly and compellingly interpreted in a huge number of ways. So if you subscribe to another person’s interpretation, you are depriving the world of something invaluable: your own take on the matter.
In matters philosophical, I say that it is better to be stupid with one’s own stupidity than smart with another’s smarts. To put the matter another way, to read a great book of philosophy is not, I think, like reading a science textbook; the goal is not simply to assimilate a certain body of knowledge, but to have a genuine encounter with the thinker. In this way, reading a great work of philosophy is much more like travelling someplace new: what matters is the experience of having been there, and not the snapshots you bring back from the trip. Even if you go someplace where you can’t speak the language, where you are continually baffled by strange customs and incomprehensible speech, it is more valuable than just sitting at home and reading guidebooks. So go and be baffled, I say!
This is all just a way of warning you not to take what I will say too seriously, for what I will offer is my own interpretation, my own guide-book, so to speak. I will make some assertions, but I’d like you to be very skeptical. After all, I’m just some dude on the internet.
An Attempt at a Way In
The best advice I’ve ever gotten in regard to Heidegger was in my previous job. My boss was a professor from Europe, a very well educated man, who naturally liked to talk about books with me. At around this time, I was reading Being and Time, and floundering. When I complained of the book’s difficulty, this is what he said:
“In the Anglophone tradition, they think of language as a tool for communication. But in the European tradition, they think of language as a tool to explore the world.” He said this last statement as he reached out his arm in front of him, as if grabbing at something far away, to make it clear what he meant.
Open one of Heidegger’s books, and you will be confronted with something strange. First is the language. He invents new words; and, more frustratingly, he uses old words in unfamiliar ways, often relying on obscure etymological connections and German puns. Even more frustrating is the way Heidegger does philosophy: he doesn’t make logical arguments, and he doesn’t give straightforward definitions for his terms. Why does he write like this? And how can a philosopher do philosophy without attempting to persuade the reader with arguments? You’re right to be skeptical; but, in this review, I will try to provide you with a way into Heidegger’s philosophy, so at least his compositional and intellectual decisions make sense, even if you disagree with them. Since Heidegger’s frustrating and exasperating language is extremely conspicuous, let us start there.
Imagine a continuum of attitudes towards language. On the far end, towards the left, is the scientific attitude. There, we find linguists talking of phonemes, morphemes, syntax; we find analytic philosophers talking about theories of meaning and reference. We see sentences being diagrammed; we hear researchers making logical arguments. Now, follow me to the middle of this continuum. Here is where most speech takes place. Here, language is totally transparent. We don’t think about it, we simply use it in our day to day lives. We argue, we order pizzas, we make excuses to our bosses, we tell jokes; and sometimes we write book reviews. Then, we get to the other end of the spectrum. This is the place where lyric poetry resides. Language is not here being used to catalogue knowledge, nor is it transparent; here, in fact, language is somehow mysterious, foreign, strange: we hear familiar words used in unfamiliar ways; rules of syntax and semantics are broken here; nothing is as it seems.
Now, what if I ask you, what attitude gets to the real essence, the real fundamentals of language? If you’re like me, you’d say the first attitude: the scientific attitude. It seems commonsensical to think that you understand language more deeply the more you rigorously study it; and one studies language by setting up abstract categories, such as ‘syntax’ and ‘phoneme’. But this is where Heidegger is in fundamental disagreement; for Heidegger believes that poetry reveals the essence of language. In his words: “Language itself is poetry in the essential sense.”
But isn’t this odd? Isn’t poetry a second or third level phenomenon? Doesn’t poetry presuppose the usual use of language, which itself presupposes the factual underpinning of language investigated by science? In trying to understand why Heidegger might think this, we are led to his conception of truth.
If you are like me, you have a commonsense understanding of what makes a statement true or false. A statement is “true” if it corresponds to something in reality; if I say “the glass is on the table,” it is only true if the glass really is on the table. Heidegger thinks this is entirely wrong; and in place of this conception of truth, Heidegger proposes the Greek word “aletheia,” which he defines as “unconcealment,” or “letting things reveal themselves as themselves.”
It’s hard to describe what this means abstractly, so let me give you an example. Let’s say you are a peasant, and a rich nobleman has just invited you to his house. You get lost, and wander into a room. It is filled with strange objects that you’ve never seen before. You pick something up from a table. You hold it in your hands, entranced by the strange shape, the odd colors, the weird noises it emits. You are totally lost in contemplation of the object, when suddenly the nobleman waltzes into the room and says “Oh, I see you’ve found my watch.” According to Heidegger, what the nobleman just did was to cover up the watch in a kind of veneer of obviousness. It is simply a watch, he says, just one among many of its kind, and therefore obvious. The peasant, meanwhile, was experiencing the object as an object, and letting it reveal itself to him.
This kind of patina of familiarity is, for Heidegger, what prevents us from engaging in serious thinking. This is why Heidegger spends so much time talking about the dangers of conformity, and also why he is ambivalent about the scientific project: for what is science but the attempt to make what is not obvious, obvious? To bring the unfamiliar into the realm of familiarity? Heidegger thinks that this feeling of unfamiliarity is, on the contrary, the really valuable thing; and this is why Heidegger talks about moods—such as anxiety, which, he says, discloses the “Nothing.” Now, it is a favorite criticism of some philosophers to dismiss Heidegger as foolish by treating “Nothing” as something; but this misses his point. When Heidegger is talking of anxiety as the mood that discloses the “Nothing” to us, he means that our mood of anxiety is the subrational realization of the bizarreness of existence. That is, our anxiety is the way that the question faces us: “Why is there something rather than nothing?”
This leads us quite naturally to Heidegger’s most emblematic question, the question of Being: what does it mean to be? Heidegger contends that this question has been lost to history. But has it? Philosophers have been discussing metaphysics for millennia. We have idealism, materialism, monism, monadism—aren’t these answers to the question of Being? No, Heidegger says, and for the following reason. When one asserts, for example, that everything is matter, one is asserting that everything is, at base, one type of thing. But the question of Being cannot be answered by pointing to a specific type of being; so we can’t answer the question, “what does it mean to be?” by saying “everything is mind,” or “everything is matter,” since that misses the point. What does it mean to be at all?
So now we have to circle back to Heidegger’s conception of truth. If you are operating with the commonsense idea of truth as correspondence, you will quite naturally say: “The question of ‘Being’ is meaningless; ‘Being’ is the most empty of categories; you can’t give any further analysis to what it ‘means’ to exist.” In terms of correspondence, this is quite true; for how can any statement correspond with the answer to that question? A statement can only correspond to a state of affairs; it cannot correspond to the “stateness” of affairs: that’s meaningless. However, if you are thinking of truth along Heidegger’s lines, the question becomes more sensible; for what Heidegger is really asking is “How can we have an original encounter with Being? How can I experience what it means to exist? How can I let the truth of existence open itself up to me?”
To do this, Heidegger attempts to peel back the layers of familiarity that, he feels, prevent this genuine encounter from happening. He tries to strip away our most basic commonsense notions: true vs. false, subject vs. object, opinion vs. fact, and virtually any other you can name. In so doing, Heidegger tries to come up with ways of speaking that do not presuppose these categories. So in struggling through his works, you are undergoing a kind of therapy to rid yourself of your preconceptions, in order to look at the world anew. In his words: “What is strange in the thinking of Being is its simplicity. Precisely this keeps us from it. For we look for thinking—which has its world-historical prestige under the name “philosophy”—in the form of the unusual, which is accessible only to initiates.”
What on earth are we to make of all this? Is this philosophy or mystical poetry? Is it nonsense? That’s a tough question. If by “philosophy” we mean the examination of certain traditional questions, such as those of metaphysics and epistemology, then it might be fair to say that Heidegger wasn’t a philosopher—at least, not exactly. But if by “philosophy” we mean thinking for the sake of thinking, then Heidegger is a consummate philosopher; for, in a sense, this is the point of his whole project: to get us to question everything we take for granted, and to rethink the world with fresh minds.
So should we accept Heidegger’s philosophy? Should we believe him? And what does it even mean to “believe” somebody who purposely doesn’t make assertions or construct arguments? Is this acceptable in a thinker? Well, I can’t speak for you, but I don’t accept his picture of the world. To sum up my disagreement with Heidegger as pithily as possible, I disagree with him when he says: “Ontology is only possible as phenomenology.” On the contrary, I do not think that ontology necessarily has anything to do with phenomenology; in other words, I don’t think that our experiences of the world necessarily disclose the world in a fundamental way. For example, Heidegger thinks that everyday sounds are more basic than abstract acoustical signals, and he argues this position like so:
We never really first perceive a throng of sensations, e.g., tones and noises, in the appearance of things—as this thing-concept alleges; rather we hear the storm whistling in the chimney, we hear the three-motored plane, we hear the Mercedes in immediate distinction from the Volkswagen. Much closer to us than all sensations are the things themselves. We hear the door shut in the house and never hear acoustical sensations or even mere sounds. In order to hear a bare sound we have to listen away from things, divert our ear from them, i.e., listen abstractly.
To Heidegger, the very fact that we perceive sounds this way implies that this way of perceiving is more fundamental. But I cannot accept this. Hearing the door shut “first” is only a fact of our perception; it does not tell us anything about how our brains process auditory signals, nor about what sound is, for that matter. This is why I am a firm believer in science: the universe does not give up its secrets lightly, but must be probed and prodded. When we leave nature to reveal itself to us, we aren’t left with much.
It was already clear from my introduction that I am no Heideggerian. As the opening quote shows, he was partly remonstrating against our dichotomy of subjective opinion vs. objective fact; yet this notion is the very one I began my review with. You’ve been hoodwinked from the start, dear reader; for by acknowledging that this is just one opinion among many, you have, willingly or unwillingly, disagreed with Heidegger.
So was reading Heidegger a waste of time for me? If I disagree with him on almost everything, what did I gain from reading him? Well, for one thing, as a phenomenologist pure and simple, Heidegger is excellent; he gets to the bottom of our experience of the world in a way few thinkers can. What’s more, even if we reject his ontology, many of Heidegger’s points are interesting as pure cultural criticism; by digging down deep into many of our preconceptions, Heidegger manages to reveal some major biases and assumptions we make in our daily lives. But the most valuable part of Heidegger is that he makes you think: agree or disagree, decide he is a loony or a genius, he will make you think, and that is invaluable.
So, to bring this review around to this volume, I warmly push it into your hands. Here is an excellent introduction to the work and thought of an original mind—much less imposing than Being and Time. I must confess that I was pummeled by Heidegger’s first book, beaten senseless. This book was, by contrast, often pleasant reading. It seems that Heidegger jettisoned a lot of his jargon later in life; he even occasionally comes close to being lucid and graceful. I especially admire “The Origin of the Work of Art,” which is easily one of the greatest reflections on art that I’ve had the good fortune to read.
I think it’s only fair to give Heidegger the last word:
… if man is to find his way once again into the nearness of Being he must first learn to exist in the nameless. In the same way he must recognize the seductions of the public realm as well as the impotence of the private. Before he speaks man must first let himself be claimed again by Being, taking the risk that under this claim he will seldom have much to say.