Quotes & Commentary #45: Montaigne

No pleasure has any savor for me without communication. Not even a merry thought comes to my mind without my being vexed at having produced it alone without anyone to offer it to.

—Michel de Montaigne, Essays

A few months ago I started an Instagram account, and since then something funny has happened.

Whenever I go out—whether it’s on a walk, a trip to a new city, or sometimes even when I’m alone in my room—a part of my brain keeps on the lookout for a good photo. The shimmering reflections in a pond, the crisscrossing angles and vanishing perspective of a cityscape, or any chance concatenation of color and light: I notice these, and do my best to photograph them when they appear.

Now it has even gotten to the point that, whenever I am taking a photo, I mentally plan how I will digitally edit it, compensating for its defects and accentuating its merits—sharpening some lines and blurring others, turning up the contrast and saturation.

I am not mentioning all this to call attention to my photography skills—which are purely amateurish in any case—but to a pervasive aspect of modern life: the public invasion of privacy.

I have learned to see the world with an imaginary audience by my side, a crowd of silent spectators that follows me around. When I see something beautiful, I do not only savor its beauty, but think to myself: “Wow, there are lots of other people who would find this beautiful, too! I better record it.”

This requires a significant cognitive shift. It means that, aside from my own tastes, I have internalized the tastes of others; or at least I have learned where my tastes and those of others tend to overlap. The consequences of this shift are equally significant. A beautiful dawn, the way a certain window catches the sun’s rays, the flowers in bloom on my walk to work—these are no longer silent joys, but public events.

(Incidentally, as I learn to see the world with the eyes of others, something else is taking place: I am learning to see the world through technology. By now I know when something is too far away to capture, or when the lighting will spoil the photo; and I know which defects I can edit and which ones I can’t. This means I have internalized, not only public tastes, but also technological limitations. My seeing is becoming ever-more mediated.)

It is customary to bemoan this development. I have done so myself, many times. Just today, as I walked through Retiro Park, I found myself thinking evil thoughts about all the people taking selfies. “Just stop posing and enjoy the beautiful day!” I thought, and began ruminating on the decline of modern culture.

And indeed, I do think it’s unhealthy, or at the very least in poor taste, to spend all your time on vacation taking photos of yourself and your friends, photos to be uploaded immediately for the benefit of all your other friends. Like anything, taking photos can be taken too far.

Nevertheless, I think it is a mistake to see this phenomenon—the public invasion of privacy—as fundamentally new. As Montaigne exemplified in a time long before Facebook, nearly everybody has the urge to share their pleasures with others. Social media is just a continuation of this. Before the internet, all of us publicized our private joys the old-fashioned way: by telling our friends and family. Adding pictures to Instagram and Snapchat is just an extension of the ancient art of telling anecdotes.

When I was on my trip to Rome, traveling alone, I visited St. Peter’s in the Vatican. It was such an impressive building that I wanted to go “Ooh” and “Aah,” to gush, to blubber in admiration, but I had no one to do this with. Alone, I had to keep my pleasures to myself; and far from being a neutral fact, I think this actually diminished my enjoyment of the experience. Unshared pleasures aren’t quite as sweet.

Why is this? A cynic would say that we share our pleasures as a form of bragging. “Look at the cool thing I’m doing! Bet your life isn’t as good as mine!”—this is what your vacation selfies say to your friends (and rivals) online. I do not deny that the cynic is partially right; bragging is an unavoidable part of social life. Who doesn’t like being envied?

This is the uglier side of the issue; and I think the bragging motivation—never openly said, but operative—is what drives people to take sharing too far. When people see themselves purely in the light of other people, in a giant popularity contest, then they fall prey to a cycle of envy and bragging. Naturally, everybody does their best to be envied; and since only public joys can be envied, somebody in the thrall of this mentality will neglect all purely private forms of pleasure.

Yet I do not think the cynic is totally right. We humans are a social species. Even the most introverted among us likes to spend time with others. And an extension of our urge to socialize is an urge to share our private selves.

Partially, we do this for validation: to have our judgments and perspective confirmed, to feel more ‘normal’ and less ‘strange’. If I think something is funny, it is a relief to know that others find it funny, too. Maybe this is a sign of weakness, a lack of self-confidence; but I think even the most confident among us feels a relief and joy when somebody confirms their own judgment.

Apart from validation, however, I think there is a simple joy in sharing. It feels good to make someone else smile or laugh; it even feels good to share a negative emotion. The feeling of being alone is one of the most painful feelings we experience, and yet loneliness so often creeps up on us. The feeling of connection—breaking through your own perspective and reaching another’s—is a natural joy, as inherently enjoyable as ice cream. Montaigne thought so; and this is also why, despite some misgivings, I enjoy using Instagram.

This is also why I write a blog rather than keep my scribbling to myself. Indeed, I find that, whenever I try to write something purely for myself, I can hardly write at all. The sentences come out mangled, the thoughts are confused, and the entire thing is a mess. To do my best writing, I’ve found that I need a public (or at least a theoretical one). The knowledge that someone else might read my writing keeps me focused; it also makes writing far more fun.

Without a reader, writing feels entirely without consequence to me, and is consequently joyless. Montaigne apparently felt the same way, which is why, despite leading a fairly reclusive life in his castle tower, he published several editions of his essays.

It is vanity to seek fame. But is it vanity to wish to share joys and to connect with others, and to be understood by as many people as possible?

Quotes & Commentary #44: Montaigne

Who does not see that I have taken a road along which I shall go, without stopping and without effort, as long as there is ink and paper in the world?

—Michel de Montaigne

One thing above all attracts me to Montaigne: we both have an addiction to writing.

It is a rather ugly addiction. I personally find those who love the sound of their own voice nearly intolerable—and unfortunately I fall into this category, too—but to be addicted to writing is far, far worse: not only do I love airing my opinions in conversation, but I think my views are so valuable that they should be shared with the world and preserved for future generations.

Why do I write so much? Why do I so enjoy running my fingers over a keyboard and seeing letters materialize on the screen? What mad impulse keeps me going at it, day after day, without goal and without end? And why do I think it’s a day wasted if I don’t have time to do my scribbling?

In his essay “Why I Write,” George Orwell famously answered these questions for himself. His first reason was “sheer egoism,” and this certainly applies to me, although I would define it a little differently. Orwell characterizes the egoism of writers as the desire “to seem clever, to be talked about, to be remembered after death,” and in general to live one’s own life rather than to live in the service of others.

I would call this motivation “vanity” rather than “egoism.” Vanity is undeniably one of my motivations to write—especially the desire to seem clever, one of my uglier qualities. But this vanity is rather superficial; there is a deeper egoism at work.

Ever since I can remember, I have had the awareness, at times keen and painful, that the world of my senses, the world that I share with everyone else, is separate and distinct from the world in my head—my feelings, imagination, thoughts, my dreams and fantasies. The two worlds are intimately related, and communicate constantly, but there is still an insuperable barrier cutting off one from the other.

The problem with this was that my internal world was often far more interesting and beautiful to me than the world outside. Everyone around me seemed totally absorbed in things that were, to me, boring and insipid; and I was expected to show interest in these things too, which was frustrating. If only I could express the world in my head, I thought, and bring my internal world into the external world, then people would realize that the things they busy themselves with are silly and would occupy their time with the same things that fascinated me.

But how to externalize my internal world? This is a constant problem. Some of my sweetest childhood memories are of playing all by myself, with sticks, rocks, or action figures, in my room or my backyard, in a landscape of my own imagination. While alone, I could endow my senses with the power of my inner thoughts, and externalize my inner world for myself.

Yet to communicate my inner world to others, I needed to express it somehow. This led to my first artistic habit: drawing. I used to draw with the same avidity as I write now, filling up pages and pages with my sketches. I advanced from drawings of whales and dinosaurs, to medieval arms and armor, to modern weaponry. Eventually this gave way to another passion: video games.

Now, obviously, video games are not a means of self-expression; but I found them addicting nonetheless, and played them with a seriousness and dedication that alarms me in retrospect—so many hours gone!—because they were an escape. When you play a video game you enter another world, in many ways a more exciting and interesting world, a world of someone’s imagination. And you are allowed to contribute to this dream world—in part, at least—and adopt a new identity, in a world that abides by different rules.

Clearly, escapism and self-expression, even if they spring from the same motive, are incompatible; in the first you abandon your identity, and in the second you share it. For this reason, I couldn’t be satisfied for long with gaming. In high school I began to learn guitar, to sing, and eventually to write my own songs. This satisfied me for a while; and to a certain extent it still does.

But music, for me, is primarily a conduit of emotion; and as I am not a primarily emotional person, I’ve always felt, even after writing my own songs, that the part of myself I wanted to express, the internal world I still wanted to externalize, was still getting mostly left behind. It was this that led me to my present addiction: writing.

I should pause here and note that I’m aware how egotistical and cliché this narrative seems. My internal world is almost entirely a reflection of the world around me—far, far less interesting than the world itself—and my brain, I’m sorry to say, is mostly full of inanities. I am in every way a product—a specifically male, middle-class, suburban product—of my time and place; and even my narrative about trying to express myself is itself a product of my environment. My feeling of being original is unoriginal. My life story is a stereotype.

I know all of this very well, and yet I cannot shake this persistent feeling that I have something I need to share with the world. More than share, I want to shape the world, to mold it, to make it more in accordance with myself. And my writing is how I do that. This is egoism in its purest form: the desire to remake the world in my image.

A blank page is a parallel world, and I am its God. I control what happens and when, how it looks, what are its values, how it ends, and everything else. This feeling of absolute control and complete self-expression is what is so intoxicating about writing, at least for me. Once you get a taste of it, you can’t stop. Montaigne couldn’t, at least: he kept on editing, polishing, revising, and expanding his essays until his death. And I suspect I’ll do the same, in my pitiful way, pottering about with nouns and verbs, eventually running out of new things to write about and so endlessly rehashing old ones, until I finally succumb to the tooth of time.

After mentioning the egoism of writers, Orwell goes on to mention three other motivations: aesthetic enthusiasm, historical impulse, and political purpose. But I think he leaves two things out: self-discovery and thinking.

Our thoughts are fugitive and vague, like shadows flickering on the wall, forever in motion, impossible to get hold of. And even when we do seem to come upon a complete, whole, well-formed thought, as often as not it pops like a soap bubble as soon as we stretch out our fingers to touch it. Whenever I try to think something over silently, without recording my thoughts, I almost inevitably find myself grasping at clouds. Instead of reaching a conclusion, I get swept off track, blown into strange waters, unable to remember even where I started.

Writing is how I take the fleeting vapors of my thoughts and solidify them into definite form. Unless I write down what I’m thinking, I can’t even be sure what I think. This is why I write these quotes and commentary; so far it has been a journey of self-investigation, probing myself to find out my opinions.

When I commit to write, it keeps me on a certain track. Unless you are like Montaigne and write wherever your thoughts take you, writing inevitably means sticking to a handful of subjects and proceeding in an orderly way from one to the other. Since I am recording my progress, and since I am committed to reaching the conclusion, this counteracts my tendency to get distracted or to go off topic, as I do when I think silently.

This essay is a case in point. Although these are things I have often talked and thought about, I had never fully articulated to myself the reasons why I write, or strung all my obsessions into a narrative with a unified motivation, as I did above, until I decided to write about them. No wonder I’m addicted.

Quotes & Commentary #43: Hemingway

So far, about morals, I only know that what is moral is what you feel good after and what is immoral is what you feel bad after.

—Ernest Hemingway

It is an essential part of the process of maturing, I think, to come to terms with your own emotions. Can they be trusted? How far? In what circumstances are they misleading? Do they make you act irrationally or do things you don’t normally do? Are you afraid of your emotions? Are you afraid of communicating them to others? Why? Do you tend to bury your emotions, or to ignore them? With what consequences? All these questions, and more, are unavoidable as we grow older and learn how to deal with ourselves.

It is worth pointing out the odd fact, taken for granted by nearly everyone, that our emotions are discussed as something essentially separate from ourselves. They are things that happen to us, things that strike us, things that affect us like a sickness. And yet they are ourselves, aren’t they? What could be more integrally a part of yourself than your feelings?

Perhaps we think of emotions as outside events, comparable to snowstorms or car accidents, because we recognize that they are universal experiences. Being angry, depressed, giddy, the feeling of being in love—the triggers of an emotion vary, but the experience itself binds us together. And in that way, the emotions can be said to be objective facts, not the most intimate part of ourselves, because they are the same for everyone.

Or perhaps we talk of our emotions as separate from ourselves because they come and go, sometimes at random, and are often beyond our control. Feeling melancholy on a lonely walk is like being caught in the rain—an event that depends on the whims of fate.

As someone who prides himself on being logical—although, heaven knows how silly I can be—my relationship with my emotions has always been rather skeptical, even suspicious. My friends in elementary school used to tease me for being robotic. As I grew up, I lost most of this robotic coldness, but some traces of it remain. I am still quite skeptical of emotions, and I still find my feelings to be suspicious.

In my experience at least, emotions cannot be trusted as sources of information. A classic example is walking out the door and feeling that you’ve forgotten something, or packing for a trip and feeling sure that there’s something you’re missing. In my case, this feeling is almost inevitably wrong; my feelings of worry or confidence have almost nothing to do with whether I have actually forgotten something.

It was a major discovery—which I only made in university—that my mood had very little to do with the things I normally hold responsible for it. Sometimes I would get angry and think about all the unpleasant things my friends did and said, all the inanities of my roommates, all the annoyances of my classes. Or I would get melancholy and think about things I missed from home, or convince myself that I was lonely and unloved, or castigate myself for being a failure.

And yet all of these things I blamed for my mood were totally irrelevant. Almost inevitably, if I sat down and ate something, or if I had a coffee and a candy bar, my mood improved dramatically. Indeed, after I drink coffee I am often ecstatically happy, and I think equally unrealistic thoughts about how great my life is.

Experiences like these reinforce my skepticism about feelings. I can feel sure I’ve forgotten something, even after checking three times. I can be enraged and curse the world and everyone in it, and yet this is only due to hunger. Feelings come and go, each one seeming to tell me something clear and definite, only to be replaced in the next moment by another feeling that tells me the exact opposite thing. Each one is convincing in its strength, and yet each is totally devoid of substance; they give me the feeling of certainty without any evidence to support it.

In Cognitive Behavioral Therapy, there is a form of cognitive distortion, erroneous thinking, called “emotional reasoning.” This consists precisely in trusting your emotions. Depressed people often feel ashamed, worthless, and hopeless, and then reason that these things must be true, since why else would they feel that way so persistently? Similarly, anxious people feel afraid, and believe that this fear is justified and is telling them about a real threat to their safety.

Indeed, it seems to me—or at least it’s been my experience—that getting over anxiety and depression involves learning to distrust your own feelings. I have learned, for example, that my feelings of fear often have nothing to do with something bad that might actually happen; and that my feelings of shame are not a reliable indicator of what other people will actually think.

To a certain extent, I think most people would agree, in theory at least, that emotions can be misleading. Nevertheless, there is one domain in which nearly everyone puts implicit faith in their emotions: morality.

I remember reading a book by Steven Pinker in which he demonstrated the emotional basis of our moral thinking in this way.

Consider this short situation: A family’s dog, who had lived with them for many years, was killed in front of their house by a car. The family heard that dog meat was delicious, so they cut up the dog and cooked him for dinner.

Now, in this situation, did the family act immorally? If you’re like me, you feel somewhat disgusted by this; and maybe you have decided that you’d never want to be friends with this family; and maybe you think it heartless that they could eat their loyal friend and companion. But did they do anything immoral?

I don’t think they did, because they didn’t hurt anyone or act contrary to the categorical imperative. And yet I admit that the first time I read this, I felt disgusted and almost outraged at this family. This illustrates Pinker’s general point: we have moral feelings first, and then try to rationalize them later. In other words, our moral reactions are not based on any logical standard but instead on gut feeling.

It is, of course, difficult to rationalize morality. Philosophers still struggle with it, and there are no easy answers. Be that as it may, this is no excuse to substitute feeling for thinking. Even a slight acquaintance with history shows that people have thought many things were terribly immoral—mixed-race marriages, or premarital sex—that nowadays don’t raise an eyebrow. The world is full of taboos and prohibitions that, to outsiders, don’t make any sense. We are capable of having strong moral reactions about activities that don’t harm anyone or pose any threat to society.

I do not know why people continue to trust their feelings of disgust and outrage when it has been shown again and again, even in my lifetime, that these feelings are often based on nothing at all. We trust our gut like it’s the Oracle of Delphi, handing out moral verdicts from on high; but our guts often disagree with one another, and just as often contradict themselves on different days.

Take the example of gay love. I remember when I was young, the idea of two men kissing was considered, by nearly everyone I knew, to be absolutely obscene; and now a movie that features homoerotic love has won Best Picture. (I do not mean to suggest, even for a second, that we have overcome homophobia; but we have made progress.) The controversy surrounding trans people seems to be based on this same gut reaction of disgust. The “argument” about the “dangers” of trans people in public restrooms is so devoid of substance that I can only conclude it is a feeble attempt to rationalize a feeling.

And yet I wouldn’t be surprised if, one day, being transgender was considered as unremarkable as gay love. I can see no logical reason to regulate, ban, or even worry about sexuality, gender, and orientation, because they don’t hurt anyone and don’t pose any threat to society. You may not like gay love, you may find the idea of trans people gross, and that’s fine, but this feeling is no valid indication that these things are wrong.

This brings me around to Hemingway’s quote. Hemingway said this in connection with bullfighting. He expected to find bullfighting disgusting, but he loved it, and for that reason didn’t think it was wrong. Well, it’s obvious by now that I don’t agree with this method of telling right from wrong. If bullfighting is right or wrong, we need to explain why, with reference to some standard.

My problem is that I normally think of morality as a relationship between humans, and I actually don’t know how to think about morality regarding animals. A bull cannot understand a duty, an obligation, or the idea of consequences; a bull can’t be reasoned with or convinced. All of these things are necessary, I think, for a creature to be a moral agent, to be bound and protected by a system of moral injunctions. So when we’re dealing with animals, can an action be right or wrong?

My gut feeling is that bullfighting is wrong, because it involves animal cruelty. But this feeling, however intense, is just that: a feeling. Can I rationally believe bullfighting is wrong while continuing to eat hamburgers? I really don’t know. Thus I am in the uncomfortable situation of having a dilemma for which my moral reasoning provides no solution; and this leaves me with nothing but a feeling. I suppose I’ll have to read and think some more about the subject.

Quotes & Commentary #42: Montaigne

Everything has a hundred parts and a hundred faces: I take one of them and sometimes just touch it with the tip of my tongue or my fingertips, and sometimes I pinch it to the bone. I jab into it, not as wide but as deep as I can; and I often prefer to catch it from some unusual angle.

—Michel de Montaigne

The pursuit of knowledge has this paradoxical quality: it demands perfection and yet continuously, inevitably, and endlessly fails in its goal.

Knowledge demands perfection because it is meant to be true, and truth is either perfect or nonexistent—or so we like to assume.

Normally, we think about truth like this: I make a statement, like “the cat is on the mat,” and this statement corresponds to something in reality—a real cat on a real mat. This correspondence must be perfect to be valid; whether the cat is standing beside the mat or is up a tree, the statement is equally false.

To formulate true statements—about the cosmos, about life, about humanity—this is the goal of scholarship. But can scholarship end? Can we get to a point at which we know everything and we can stop performing experiments and doing research? Can we reach the whole truth?

This would require scholars to create works that were both definitive—unquestioned in their authority—and exhaustive—covering the entire subject. What would this entail? Imagine a scholar writing about the Italian Renaissance, for example, who wants to write the perfect work, the book that totally and completely encapsulates its subject, rendering all additional work unnecessary.

This seems as if it should be theoretically possible, at least. The Italian Renaissance was a sequence of events—individuals born, paintings painted, sculptures sculpted, trips to the toilet, accidental deaths, broken hearts, drunken brawls, late-night conversations, outbreaks of the plague, political turmoil, marriage squabbles, and everything else, great and small, that occurred within a specific period of time and space. If our theoretical historian could write down each of these events, tracing their causation, allotting each its proportional space, neutrally treating each fact, then perhaps the subject could be definitively exhausted.

There are many obvious problems with this, of course. For one, we don’t have all the facts available, but only a highly selective, imperfect, and tentative record, a mere sliver of a fraction of the necessary evidence. Another is that, even if we did have all the facts, a work of this kind would be enormously long—in fact, as long as the Italian Renaissance itself. This alone makes the undertaking impossible. But this is also not what scholars are after.

A book that represented each fact neutrally, in chronological sequence, would not be an explanation, but a chronicle; it would recapitulate reality rather than probe beneath the surface; or rather, it would render all probing superfluous by representing the subject perfectly. It would be a mirror of reality rather than a search for its fundamental form.

And yet our brains are not, and can never be, impartial mirrors of reality. We sift, sort, prod, search for regularities, test our assumptions, and in a thousand ways separate the important from the unimportant. Our attention is selective of necessity, not only because we have a limited mental capacity, but because some facts are much more necessary than others for our survival.

We have evolved, not as impartial observers of the world, but as actors in a contest of life. It makes no difference, evolutionarily speaking, if our senses represent “accurately” what is out there in the world; it is only important that they alert us to threats and allow us to locate food. There is reason to believe, therefore, that our senses cannot be literally trusted, since they are adapted to survival, not truth.

Survival is, of course, not the operative motivation in scholarship. More generally, some facts are more interesting than others. Some things are interesting simply in themselves—charms that strike the sight, or merits that win the soul—while others are interesting in that they seem to hold within themselves the reason for many other events.

A history of the Italian Renaissance that gave equal space to a blacksmith as to Pope Julius II, or equal space to a parish church as to the Sistine Chapel, would be unsatisfactory, not because it was inaccurate, but because its priorities would be in disarray. All intellectual work requires judgment. A historian’s accuracy might be unimpeachable, and yet his judgment so faulty as to render his work worthless.

We have just introduced two vague concepts into our search for knowledge: interest and judgment—interest being the “inherent” value of a fact, and judgment our faculty for discerning interest. Both of these are clearly subjective concepts. So instead of impartially representing reality, our thinkers experience reality through a distorted lens—the lens of our senses, further shaped by culture and upbringing—and from this blurry image of the world select what portion of that distorted reality they deem important.

Their opinion of what is beautiful, what is meritorious, what is crucial and what is peripheral, will be based on criteria—either explicit or implicit—that are not reducible to the content itself. In other words, our thinkers will be importing value judgments into their investigation, judgments that will act as sieves, catching some material and letting the rest slip by.

Even more perilous, perhaps, than the selection of facts, will be the forging of generalizations. Since, with our little brains, we simply cannot represent reality in all its complexity, we resort to general statements. These are statements about the way things normally happen, or the characteristics that things of the same type normally have—statements that attempt to summarize a vast number of particulars within one abstract tendency.

All generalizations employ inductive reasoning, and thus are vulnerable to Hume’s critique of induction. A thousand red apples are no proof that the next apple will also be red. And even if we accept that generalizations are always more or less true—true as a rule, with some inevitable exceptions—this leaves undefined how well the generalization fits the particulars. Is it true nine times out of ten, or only seven? How many apples out of a hundred are red? Finally, making a generalization requires selecting one quality—say, the color of apples, rather than their size or shape—from the many that the particulars possess, and is consequently always arbitrary.

More hazardous still is the act of interpretation. By interpretation, I mean deciding what something means. Now, in some intellectual endeavors, such as the hard sciences, interpretation is not strictly necessary; only falsifiable knowledge counts. Thus, in quantum mechanics, it is unimportant whether we interpret the equations according to the Copenhagen interpretation or the Many-Worlds interpretation—whether the wave-function collapses, or reality splits apart—since in any case the equations predict the right result. In other words, we aren’t required to scratch our heads and ask what the equations “mean” if they spit out the right number; this is one of the strengths of science.

But in other fields, like history, interpretation is unavoidable. The historian is dealing with human language, not to mention the vagaries of the human heart. This alone makes any sort of “objective” knowledge impossible in this realm. Interpretation deals with meaning; meaning only exists in experience; experience is always personal; and the personal is, by definition, subjective. Two scholars may differ as to the meaning of, say, a passage in a diplomat’s diary, and neither could prove the other was incorrect, although one might be able to show her interpretation was far more likely than her counterpart’s.

Let me stop and review the many pitfalls on our road to perfect knowledge of the Italian Renaissance. First, we begin with an imperfect record of information; then we must make selections from this imperfect record. This selection will be based on vague judgments of importance and interest—what things are worth knowing, which facts explain other facts. We will also try to make generalizations about these facts—generalizations that are always tentative, arbitrary, and hazardous, and which are accurate to an undetermined extent. After doing all this, we must interpret: What does this mean? Why did this happen? What is the crucial factor, what is mere surface detail? And remember that, before we even start, we are depending on a severely limited perception of the world, and a perspective warped by innumerable prejudices. Is it any wonder that scholarship goes on infinitely?

At this point, I am feeling a bit like Montaigne, chasing my thoughts left and right, trying to weave disparate threads into a coherent whole, and wondering how I began this already overlong essay. Well, that’s not so bad, I suppose, since Montaigne is the reason I am writing here in the first place.

Montaigne was a skeptic; he did not believe in the possibility of objective knowledge. For him, the human mind was too shifting, the human understanding too weak, the human lifespan too short, to have any hope of reaching a final truth. Our reasoning is always embodied, he observed, and is thus subjected to our appetites, excitements, passions, and fits of lassitude—to all of the fancies, hobbyhorses, prejudices, and vanities of the human personality.

You might think, from the foregoing analysis, that I take a similar view. But I am not quite so cheerfully resigned to the impossibility of knowledge. It is impossible to find out the absolute truth (and even if we could, we couldn’t be sure when or if we did). Through science, however, we have developed a self-correcting methodology that allows us to approach ever nearer to the truth, as evidenced by our increasing ability to manipulate the world around us through technology. To be sure, I am no worshiper of science, and I think science is fallible and limited to a certain domain. But total skepticism regarding science would, I think, be foolish and wrong-headed: science does what it’s supposed to do.

What about domains where the scientific method cannot be applied, like history? Well, here more skepticism is certainly warranted. Since so much interpretation is needed, and since the record is so imperfect, conclusions are always tenuous. Nevertheless, this is no excuse to be totally skeptical, or to regard all conclusions as equally valid. The historian must still make logically consistent arguments, and back up claims with evidence; their theories must still plausibly explain the available evidence, and their generalizations must fit the facts available. In other words, even if a historian’s thesis cannot be falsified, it must still conform to certain intellectual standards.

Unlike in science, however, interpretation does matter, and it matters a great deal. And since interpretation is always subjective, this makes it possible for two historians to propose substantially different explanations for the same evidence, and for both of their theories to be equally plausible. Indeed, in an interpretive field like history, there will be as many valid perspectives as there are practitioners.

This brings us back to Montaigne again. Montaigne used his skepticism—his belief in the subjectivity of knowledge, in the embodied nature of knowing—to justify his sort of dilettantism. Since nobody really knows what they’re talking about, why can’t Montaigne take a shot? This kind of perspective, so charming in Montaigne, can be dangerous, I think, if it leads one to abandon intellectual standards like evidence and argument, or if it leads to an undiscerning distrust in all conclusions.

Universal skepticism can potentially turn into a blank check for fundamentalism, since in the absence of definite knowledge you can believe whatever you want. Granted, this would never have happened to Montaigne, since he was wise enough to be skeptical of himself above all; but I think it can easily befall the less wise among us.

Nevertheless, if proper respect is paid to intellectual standards, and if skepticism is always turned against oneself as well as one’s peers, then I think dilettantism, in Montaigne’s formulation, is not only acceptable but admirable:

I might even have ventured to make a fundamental study if I did not know myself better. Scattering broadcast a word here, a word there, examples ripped from their contexts, unusual ones, with no plan and no promises, I am under no obligation to make a good job of it nor even to stick to the subject myself without varying it should it so please me; I can surrender to doubt and uncertainty and to my master-form, which is ignorance.

Nowadays it is impossible to be an expert in everything. To be well-educated requires that we be dilettantes, amateurs, whether we like it or not. This is not to be wholly regretted, for I think the earnest dilettante has a lot to contribute to the pursuit of knowledge.

Serious amateurs (to use an oxymoron) serve as intermediaries between the professionals of knowledge and the less interested lay public. They also serve as a kind of check on professional dogmatism. Because they have one tiptoe in the subject, and the rest of their body out of it, they are less likely to get swept away by a faddish idea or to conform to academic fashion. In other words, they are less vulnerable to groupthink, since they do not form a group.

I think serious amateurs might also make a positive contribution, at least in some subjects that require interpretation. Although the amateur likely has less access to information and lacks the resources to carry out original investigation, each amateur has a perspective, a perspective which may be highly original; or she may notice something previously unnoticed, which puts old material in a new light.

Although respect must be paid to expertise, and academic standards cannot be lightly ignored, it is also true that professionals do not have a monopoly on the truth—and for all the reasons we saw above, absolute truth is unattainable, anyway—so there will always be room for fresh perspectives and highly original thoughts.

Montaigne is the perfect example: a sloppy thinker, a disorganized writer, a total amateur, who was nonetheless the most important philosopher and man of letters of his time.

Quotes & Commentary #41: Emerson

It is impossible to extricate oneself from the questions in which our age is involved. You can no more keep out of politics than out of the frost.

—Ralph Waldo Emerson

I suppose Emerson wrote this at a time when insulation was far less robust, since nowadays it is perfectly possible to keep out of the frost. Or perhaps he meant that the only way to keep out of politics was to shut oneself up like a hermit. In any case, though the comparison has aged, the sentiment certainly has not.

Lately I have been reading some of Orwell’s essays, which called this quote to mind. It was Orwell’s essay on Dickens—in which he utters his famous dictum, “All art is propaganda”—that specifically made me think of Emerson.

Is it really true that all art contains political preaching? Is it true that it is impossible to extricate oneself from controversial questions?

To these questions I would give a qualified “yes.” I do not think it possible to create art that is politically neutral, since our thoughts about personality, philosophy, nationality, morality, sexuality, human nature, society, and so forth, all have political ramifications, even if the author has not thought through these ramifications. Every artistic choice—the characters, the setting, the plot—carries ideological baggage, even if this baggage is unintended.

Even if we attempted to create “art for art’s sake”—purely formal art, devoid of any identifiable content—this, too, would have political consequences, since it takes a stance, a very particular stance, on the role of art and the artist in society.

Before I go on, I will try to define what I mean by “politics.” To me, politics is the struggle between demographic groups for resources and power. Politics isn’t politics without controversy, since it necessarily involves a zero-sum game. This controversy is typically carried on in highly charged, stringently moral language; but the fundamental motive that animates political struggle is self-interest.

Where I disagree with Orwell is that I think the political content of art is usually uninteresting, and plays little role in the art’s quality. There is no contradiction in saying that a great novel may embody backwards political principles, or noting that a movie that champions progressive values may be boring and amateurish.

Dante’s Divine Comedy, for example, is stuffed to the brim with the politics of his age; and these political passages are, in my opinion, inevitably the weakest and most tiresome parts of the poem. The Divine Comedy is great in spite of, not because of, its politics. Likewise, the political implications of Milton’s Paradise Lost are chiefly of historical interest, and for me neither add to nor subtract from the poem’s artistic force. Nothing ages faster than politics, since politics is always embroiled in transitory struggles between factions.

Orwell’s essay itself, ironically enough, also illustrates the limitations of seeing art as propaganda. Orwell attempts to treat Dickens as a sort of social philosopher—trying to ferret out Dickens’s views on the state, on the economy, on education, on the good life—only to repeatedly hit a wall. Dickens was not a reformer, and not a revolutionary; he was, if anything, a moralist, as Orwell himself admits. But above all Dickens was a novelist, something that should not need pointing out. His books are far more interesting as novels than as sermons.

Do not mistake my meaning. I am not arguing for “art for art’s sake.” Some art cannot be properly appreciated without noting its political message. I am, however, arguing that the political implications of a work of art do not exhaust its meaning, nor do they even constitute its most valuable meaning.

The aesthetic is as valid a category as the political and the moral. And by categorizing something as “art,” we implicitly acknowledge that its aesthetic qualities are its most important traits, and determine its ultimate value. Even if we insist otherwise, the very fact that we differentiate between polemical cartoons and portraits, between political pamphlets and plays, between national anthems and symphonies, betrays the fact that we consider art a special category.

How anyone chooses to interpret a piece of art is, of course, up to them. Great art is distinguished by its ability to inspire nearly infinite reactions. But I do believe that every interpretation, if it wishes to respect the work in question, ought to increase our appreciation of it, or at least to try.

When somebody reads a novel solely for its political content, and then evaluates it solely on the extent to which it agrees or disagrees with the interpreter’s beliefs, the work of art is turned into a mere weapon of political struggle. In other words, to treat art merely as propaganda is not to respect it as art.

I have heard many movies and television shows denounced for their political implications; nowadays there are endless controversies about representation in media. Now, I believe the question of representation is undoubtedly important. But to condemn or champion works of art purely on this criterion is, I think, just as narrow-minded as ignoring the question of representation altogether.

Art speaks in different languages to different people. This is its magic and its lasting value. And anybody who thinks that they unequivocally “know” the meaning of a work of art, and is so politically self-righteous that they think they can pronounce eternal judgement on its worth, is acting tyrannically, even if they are mouthing sentiments of egalitarianism.

* * *

Parenthetically, it is worth noting that this anxiety about representation and political values in art, so common nowadays, is grounded in a certain tacit theory of human behavior. This is the belief that our media exerts a decisive influence over our values and actions. Indeed, I’m sure this proposition would hardly be regarded as controversial in some circles.

But isn’t it equally possible that our media is just a reflection of our values and actions? And doesn’t the very fact that people are often politically dissatisfied with their media prove that we are not under its control? For, if the influence of media were decisive over our political perspective, how could we ever be dissatisfied with it?

Artists and art critics tend to be intellectuals. Intellectuals are naturally prone to believing that humans are motivated by ideas, since that’s what motivates intellectuals. Thus they can be expected to pay too much attention to art as a social force, and not enough to the other things that drive human behavior, like economic trends or political institutions, since art operates on the level of symbols. 

And since much of our discourse is framed by intellectuals—people tend to become politically conscious in college, under the influence of professors—it seems likely that paying too much attention to the political power of art would be a pervasive error, which I believe it is.

Quotes & Commentary #40: Ralph Waldo Emerson

The test of civilization is the power of drawing the most benefit out of cities.

—Ralph Waldo Emerson

Cities are improbable. For most of our history we lived in little roving bands: Groups held together by personal relationships, of blood, marriage, or friendship, scattered lightly over the landscape, not tied to any particular spot but moving in accordance with their needs.

Agriculture changed that. You cannot raise crops without tending them throughout the year; thus you need a permanent settlement. Crops can also be grown and gathered more efficiently the more people there are to help; and a stable food supply can support a larger population. Cities grew up along with the crops, and a new type of communal living was born.

(I remember from my archaeology classes that early farmers were not necessarily healthier than their hunter-gatherer peers. Eating mostly corn is not very nutritious and is bad for your teeth. Depending on a single type of crop also makes you more sensitive to drought and at risk for starvation should the crops fail. This is not to mention the other danger of cities: Disease. Living in close proximity with others allows sickness to spread more easily. Nevertheless, our ancestors clearly saw some advantage to city life—maybe they had more kids to compensate for their reduced lifespan?—so cities sprung up and expanded.)

The transition must have been difficult, not least for the social strain. As cities grew, people could find themselves in the novel situation of living with somebody they didn’t know very well, or at all. For the vast majority of humankind’s history, this simply didn’t happen.

New problems must be faced when strangers start living together. In small groups, where everyone is either related or married to everyone else, crime is not a major problem. But in a city, full of strangers and neighbors, this changes: crime must be guarded against.

There is another novel problem. Hunter-gatherers can retreat from danger, but city dwellers cannot. And since urbanites accumulate more goods and food than their roving peers, they are more tempting targets for bandits. Roving nomads can swoop down upon the immobile city and carry off its grain, wine, and women. To prevent this, cities need defenses.

As you can see, the earliest denizens of cities faced many novel threats: crime from within, raids from without, and the constant danger of drought and starvation.

Government emerges from the need to organize against these threats. To discourage crime the community must come together to punish wrongdoers; to protect against attacks the community must build walls and weapons, and fight alongside one another; to protect against starvation, surplus crops must be saved for the lean times.

Hierarchies of power, codes of law, and the special status of leaders arise to fill the vacuum of organization. Religion was also enlisted in this effort, sanctifying leaders with titles and myths, reinforcing the hierarchy with rituals and customs and taboos, and uniting the people under the guardianship of the same divine shepherds.

Despite these unpropitious beginnings, the city has grown from an experiment in communal living, held together by fear and necessity, into the generic model of modern life. And I have the good fortune to live near one of the greatest cities in history.

* * *

Whenever I am alone in New York City, I wander, for as many miles as time allows. The only way to see how massive, chaotic, and remarkable New York is, is on foot. There’s no telling what you might find.

I like to walk along the river, watching the freighters with their bright metal boxes of cargo, the leviathan cruise ships carrying their passengers out to sea, the helicopters buzzing overhead, giving a few lucky tourists a glimpse of the skyline. Bridges span the water—masses of metal and stone suspended by wire—and steam pours forth from the smokestacks of power plants.  

I pass through parks and neighborhoods. Elderly couples totter by on roller blades. A lonely teenager with a determined look practices shooting a basketball. The playground is full of screaming, running, jumping, hanging, falling, fighting kids. Their mothers and fathers chat on the sidelines, casting occasional nervous glances at their offspring.

Soon I get to the United Nations building. The edifice itself is not beautiful—just a grey slab covered in glass—but what it represents is beautiful. The ideal of the United Nations is, after all, the same as the ideal of New York City. It is the ideal of all cities and of civilization itself: that we can put aside our differences and live together in peace.

The city is not just the product of political organization and economic means; it is an expression of confidence. You cannot justify building walls and houses without the belief that tomorrow will be as safe and prosperous as today. And you cannot live calmly among strangers—people who dress differently, who speak a different language, people you have never seen before and may never see again—without trust.

It is that confidence in tomorrow and that trust in our neighbors on which civilization is built. And New York City, that buzzing, chaotic, thriving hive, is a manifestation of those values.

Quotes & Commentary #39: Emerson

Each soul is a soul or an individual by virtue of its having or I may say being a power to translate the universe into some particular language of its own.

—Ralph Waldo Emerson

What does it mean for something to be subjective? It means that it depends upon a perspective to exist.

Pleasure and pain are subjective, for example, since they cannot exist independently of an observer; they must be felt to be real. Mt. Everest, on the other hand, exists objectively—or at least we think it does—since that hunk of rock and snow would persist even if there were no humans left to climb it and plant flags on its summit.

Humans, of course, can never get out of their own perspectives and know, for good and certain, that anything exists objectively. Thus “objective” facts are really inter-subjective; that is, they can be verified by other observers.

Contrary to common belief, facts cannot be verified purely through experience, since experience is always personal and therefore private. This is why we are justified in disbelieving mystic visions and reports of miracles.

Two things must happen for raw experience to be turned into objective knowledge.

First, the experience must be communicated to another observer through language. Language is a bridge between observers, allowing them to compare, in symbolic form, the reality they perceive. Language is a highly imperfect bridge, to be sure, and much information is lost by turning our raw experience into symbols; nevertheless it is the best we have.

Second, another observer must try to have an experience that matches the experience of the first one. This verification is, again, constrained by the vagueness of language.

Somebody points and says “Look, a helicopter!” Their friend looks up into the sky and says “I see it too!” This correspondence of experience, communicated through words, is the basis for our notion of the objective world.

(There is, of course, the thorny Cartesian question: How can we know for certain that both the helicopter and our friend aren’t hallucinations? We can’t.)

Subjective and objective knowledge share this quality. Our knowledge of the external world—whether a fleeting sensation of a chilly breeze, or a scientific doctrine repeatedly checked—is always symbolic.

A symbol is an arbitrary representation. All words are symbols. The relationship between the word “tree” and actual trees is arbitrary; we could also say arbol or Baum and accomplish the same end. By saying that knowledge is symbolic, I mean that the relationship between the objective facts and our representation of those facts is arbitrary. 

First, the relationship between the external stimulus and our private sensation is an arbitrary one.

Light in itself is electromagnetic radiation. In other words, light in itself doesn’t look like anything; it only has an appearance when photosensitive eyes evolve. Our visual cortex represents the photons that strike our eyes as colors. There is only a symbolic connection between the objective radiation and the internal sensation. The red I experience is only a symbol of a certain wavelength of light that my eyes pick up.

As Santayana said, the senses are poets and only portray the external world as a singer communicates his love: in metaphors. This is the basis for the common observation that there is no way of knowing whether the red I experience is the same as the red you experience. Since the connection between the objective stimulus and the subjective representation is arbitrary, and since it is only me who can observe the result, we can never know for certain how colors look to other individuals.

When we communicate our experiences to others, we translate our direct experience, which is already a symbolic representation of the world, into still more general symbols. As I said above, much information is lost during this second translation. We can, for example, say that we’re seeing the color red, but we cannot say exactly what it looks like.

Modern science, not content with the vagueness of daily speech, uses a stricter language: mathematics. And it also uses a stricter method of confirmation: controlled experiments through which rival hypotheses are tested. Nevertheless, while stricter, scientific knowledge isn’t any less symbolic. To the contrary, modern physics is distinguished for being abstract in the extreme.

To call knowledge symbolic is not to discredit it; it is merely to acknowledge the arbitrariness of our representations of the natural world. Nature can be understood, but first we must translate her into our language. The truth can be spoken, but always with an accent.

Quotes & Commentary #38: Emerson

The good writer seems to be writing about himself, but has his eye always on that thread of the universe which runs through himself, and all things.

—Ralph Waldo Emerson

While there have been many great writers who never wrote of themselves—Shakespeare comes to mind, of whom we know very little—it is certainly true that Emerson wrote reams about his cosmic self. His greatest book is his diary, an exploration of self that rivals Montaigne’s essays in depth and eloquence.

Writing about oneself, even modestly—and Emerson was not modest—inevitably involves self-mythologization. Emerson was the Homer of himself. He looked ever inward, and in his soul he found deities more alluring than Athena and battles more violent than the Trojan War. From this tumultuous inner life he created for himself a persona, a literary character, who both incorporated and transcended Emerson the man.

Everyone does this, to a certain extent. Identity is slippery, and the self is a vanishing figment of thought. As Hume pointed out, we are really just floating observers embroiled in bundles of sensations. Each moment we become a new person.

Our past only exists in our memory, which is just an internal rumor that we choose to believe. And our feeble sense of history, itself always in flux, is the only thing that ties together the confused mass of colors, sounds, and textures, the swirling indistinct thoughts, the shadowy images and daydreams, that make up our mental life.

Your identity, then, is more like the water flowing down a stream than anything solid. The self is a process.

This groundlessness, this ceaseless change, makes people uncomfortable. So much of our lives consists of building solid foundations for our insubstantial selves. Culture can be thought of as a response to this existential uncertainty; we constantly try to banish the ambiguity of identity by giving ourselves social roles, roles that tell us who we are in relation to everyone else, and who everyone else is in relation to us.

Each moment of the day carries its own ritual performance with its concomitant roles. In trains we become passengers, in cars we become commuters on our way to work, at work we become a job title, and at home we become a husband or wife.

The ritual of marriage, for example, is performed to impose an identity on you. But in order for this imposed identity to persist, the community must, in a million ways big and small, act out this new social role. Being married is a habit: a habit of acting, a habit of thinking about yourself, and a habitual way of being treated that friends, family, acquaintances, and even the federal government pick up.

A common way of reinforcing one’s identity is to attach it to something apparently solid, objective, and permanent. Thus people learn to equate their self-esteem with success, love, or money, with their marriage or their job title. But these strategies can backfire. Marriages fail and jobs end, leaving people feeling lost. And if you identify your worth with your fame, your skill, or the size of your wallet, you doom yourself to perpetual envy, since there will always be those above you.

People also position themselves demographically; they identify themselves with their age, nationality, ethnicity, race, or gender. These strategies have the merit of at least pointing to something substantial. I know, for example, that my behavior is influenced by the fact that I am an American; and by being cognizant of this identity, I understand certain behaviors of mine.

Nevertheless, this too can be taken too far, specifically when people reduce themselves to members of a group and attribute all their behavior to the groups to which they belong. Your demographic identity influences your behavior by shaping the pattern of your actions and thoughts, but it does not exhaust your identity, since identity can never be pinned down.

Those with strong wills and forceful personalities, like Emerson, wrestle with this problem somewhat differently: they create a personal mythology. This is a process by which they select some moments from their past and omit others, and by this selection create for themselves a story with a definite arc.

At the end of this arc is their persona, which is a kind of personal role, a character they invented themselves rather than adopted from society, formed by exaggerating certain qualities and downplaying others. This persona, unlike their actual, shifting identity, is stable and fixed; and by mentally identifying with this persona of theirs, they manage to push aside, for a time, the groundlessness of self.

I can’t help admiring these self-mythologizers, these artists of the literary self, the Emersons, Montaignes, and Nietzsches, who put themselves together through force of will. This procedure does carry with it some dangers, however, the most notable being the risk that you may outgrow or tire of your persona.

I can only speak from my own experience. Many times in my life I have acted out a sort of character in social situations, either from shyness about showing my real self or from an attempt to impress others; and although this strategy worked for a time, it ended up being highly unsatisfying.

In effect I trained those around me to respond to me in certain ways, to consider me in a certain light, and when I got tired of this character I was left with friends who didn’t know me. They knew a part of me, to be sure, since any character I can invent for myself will always have some of my qualities, but they didn’t know the full range of my traits.

Emerson was well aware of this danger, which is why he made it a point to be changeful and inconsistent. As he repeatedly said, he had no system. He considered himself an experimenter who played continually with new ideas. This itself was a sort of persona—the mercurial prophet, the spontaneous me—but it gave him the flexibility to expand and shift.

To me, there is nothing wrong with mythologizing yourself. The important thing is to recognize that your persona is not your self, and not to let a fixed conception of your own character constrain your actions. A personality is nothing but a pattern of behavior, and this pattern only exists in retrospect. You as you exist now are a bubble of awareness floating down a stream of sensation, a bubble that forms and reforms every passing moment.

Quotes & Commentary #37: Emerson

The years teach much which the days never know.

—Ralph Waldo Emerson

Time is curious. It is the most insubstantial of fabrics—impalpable, immaterial, invisible and ever elusive—and yet inescapably real.

Like a barren plain, time is flat and featureless.

The natural world gives time some structure. The turning of the earth about its axis gives us night and day, and the tilting of the earth’s axis gives us the seasons. The longest of these cycles is the earth’s journey around the sun, something we are about to celebrate.

Humans, never content with nature, have added new landmarks to time’s endless uniformity. We name the seasons and divide them up into months; we name these months and divide them up into days; we name the days and give each day twenty-four hours; and each one of these hours carries with it a customary activity—eating, sleeping, working, playing.

Humans simply cannot abide the emptiness of time. We cannot tolerate time’s continuity, time’s running ever onward, forward, never pausing, never returning, time’s endless movement into the infinite future. All this makes us uncomfortable.

Thus we try to make time cyclical. As we are pushed along, we stick posts in the ground to mark our passing, and try to separate each one of these posts by the same stretch of ground. We go through the year marking the same periodic holidays and private anniversaries. The new year is our universal benchmark, the measuring post by which we all orient ourselves.

Since time is featureless, it is purely arbitrary where we place this marker. We could, if we wanted, choose to regard March 1st or September 27th as the New Year. And isn’t it absurd that we try to divide something continuous, like time, into discrete elements, like years? Isn’t it silly that we think 2016 transforms itself into 2017 in one instant, instead of gradually fading one into the other as time rolls along?

But we need structure. We need landmarks in the barren expanse to remind us how far we have gone, and how far we have to go. This is why New Year’s Eve is valuable: it gives us the chance to pause and reckon up what has come to pass, what was good, what was bad, and what we could do to preserve the good and shed the bad. It gives us the opportunity to delineate, however vaguely, the arc of our lives.

Things invisible day by day reveal themselves in this wider perspective. A mass of senseless trivialities, taken together, can reveal a design and purpose that lay buried under daily cares.

Life must be lived moment to moment; but these moments, so ordinary and unremarkable, can manifest stories of the deepest significance. These stories are our lives, viewed from afar, and we need to reread these stories every once in a while to keep ourselves on the right track. 

Have a happy new year, everyone.

Quotes & Commentary #36: Emerson

I hate goodies. I hate goodness that preaches. Goodness that preaches undoes itself.

—Ralph Waldo Emerson

Certain stories snatched from daily life stick in the mind long after they have any practical relevance, because they seem to be the embodiment of a moral injunction. A friend of mine recently told me one such story.

This friend is a stand-up comedian, if not by trade, at least by dint of persistence, and he is constantly shuttling from venue to venue to hone his craft. One of these venues was run by a goodie. Now, the difference between a goodie and a good person is that the latter treats people with respect and kindness while the former makes a big show of his virtue.

This particular goodie was a straight white male who constantly advertised his progressive values. He decorated his venue with portraits of the black men killed by the police, and forbade all jokes with even a whiff of sexism.

All of this would possibly be admirable if this goodie were not constantly getting into conflicts with those around him. Like many goodies, he has a victim mentality, and blames all of these disagreements on the malevolence of his antagonists. When his venue was shut down by the fire marshal, for example, he attributed it to a conspiracy of right-wing comics.

Let me pause and remind the reader that all of this is hearsay, so I cannot vouch for its accuracy. I am told that, during the uproar that followed the closure of his venue, the goodie was publicly accused of rape. The irony of a man who shrouds himself in feminism standing accused of rape is too palpable to pass unremarked. The goodie responded to the accusation, I am told, with a counter-accusation, claiming that it was he who had been raped.

Whether all or any of this information is actually true, it rings true. Throughout my life I have met many who have acted the part of a moralist and a crusader, preaching at every turn, tolerating no dissent, treating all disagreement as vile persecution. And these goodies, almost inevitably, are unpleasant to be around, even when I agree with their ideals.

There is a basic set of values—kindness, generosity, tact, respect, tolerance, and humility—that are fundamental to real goodness, that comprise the humdrum, everyday, unremarkable virtue that keeps society working. Goodies are not humble, not respectful, and not tolerant.

Goodies are nefarious because they sacrifice these basic values for supposedly higher ones—usually a political or a religious ideal. I say “nefarious” because, by failing to reach the level of everyday virtue while appearing to rise above it, they often manage to attract around themselves a little band of followers, who confirm one another in their zealotry. Thus goodieness spreads.

Now, there is, of course, nothing wrong with sticking up for your political or religious values, and preaching has its place. But remember that these ideals, whatever their appeal, whatever their justice, do not allow their adherents to forfeit the more basic virtues of everyday life.

Real virtue does not advertise itself; it is silently manifest in each phrase and action, an accompaniment of every word and deed. If a person repeatedly and insistently calls attention to their own goodness, suspect that this goodness is goodieness, and run for the hills.