Quotes & Commentary #60: Santayana

We read nature as the English used to read Latin, pronouncing it like English, but understanding it very well.

—George Santayana

This simile about the relation between human knowledge and material fact expresses a deep truth: to understand nature we must, so to speak, translate it into human terms.

All knowledge of the world must begin with sensations. All empirical knowledge derives, ultimately, from events we perceive with our five senses. But I think it is a mistake to confuse, as the phenomenalists do, these sensations with reality itself. On the contrary, I think that human experience is of a fundamentally different sort from material reality.

The relationship between my moving finger and the movement of the string I pluck is direct: cause and effect. The relationship between the vibrations in the air caused by the guitar string and the sound of the guitar we perceive is, however, not so direct. For conscious sensations are not physical events. You cannot, even in principle, describe the subjective sensation of guitar music using physical terms such as acceleration, mass, or charge.

The brain represents the physical stimulus it receives, transforming it into a sensation, much like a composer represents human emotions using notes, harmonies, and rhythms—that is, arbitrarily. There is no essential relationship between sadness and a minor melody; they are only associated through culture and habit. Likewise, the conscious perception of guitar strings is only associated with the vibrations in the air through consistent representation: every time the brain hears a guitar, it creates the same subjective sensation. But the fact remains that the vibrations and the sensation, if they could be compared, would have nothing in common, just as sadness and minor melodies have nothing in common.

I must pause here to note a partial exception. In his Essay Concerning Human Understanding, John Locke famously draws the distinction between primary and secondary qualities. The latter are things like color, taste, smell, and sound, which are wholly subjective; the former are things like size, position, number, and shape: qualities inherent in the object and independent of the perceiving mind. Berkeley criticized this distinction; he thought that all reality was sensation, and thus that there was no basis for distinguishing primary from secondary—both exist only in human experience. Kant, on the other hand, thought that reality in itself could not, in principle, be described using any terms from human experience; and thus primary and secondary qualities were both wholly subjective.

Yet I persist in thinking that Locke was rather close to the truth. But the point must be qualified. As Einstein showed, our intuitive notions of speed, position, time, and size are only approximately correct at the human scale, and break down in situations of extreme speed or gravity. We have had the same experience with quantum physics, discovering that even our notions of location and number can be wholly inaccurate on the smallest of scales. Besides these physical considerations, any anthropologist will be full of anecdotes of cultures that conceive of space and time differently; and psychologists will note that our perception of position and shape differs markedly from that of a rat or a bat, for example.

All this being granted, I think that Locke was right in distinguishing primary from secondary qualities. Indeed, this is simply the difference between quantifiable and unquantifiable qualities. By this I mean that a person could give an abstract representation of the various sizes and locations of objects in a room; but no such abstract representation could be given of a scent. The very fact that our notions of these primary qualities could be proven wrong by physicists shows that they are categorically distinct from secondary qualities. A person may occasionally make a mistake in identifying a color or a scent, but all of humanity could never be wrong in that way. Scientists cannot, in other words, show us what red “really looks like,” in the same way that they can and have shown us how space really behaves.

Nevertheless, we have discovered, through rigorous experiment and hypothesis, that even these apparently “primary qualities”—supposedly independent of the perceiving mind—are really crude notions that are only approximately correct on the scale of human life. This is no surprise. We evolved these capacities of perception to navigate the world, not to imagine black holes or understand electrons. Thus even our most accurate perceptions of the world are only quasi-correct; and there is no reason why another being, adapted to different circumstances, might represent and understand the same facts quite differently.

It seems clear from this description that our sensations have only an indicative truth, not a literal one. We can rely on our sensations to navigate the world, but that does not mean they show us the direct truth. The senses are poets, as Santayana said, and show us reality in the guise of allegory. We humans must use our senses, since that is all we have, but in the grand scheme of reality what can be seen, heard, or touched may be only a minuscule portion of what really exists—and, as scientists have discovered, that is actually the case.

To put these discoveries to one side for a moment, there are other compelling reasons to suspect that sensations are not open windows to reality. One obvious reason is that any sensation, if too intense, becomes simply pain. Pressure, light, sound, or heat, while all separate feelings at normal intensities, all become pain when intensified beyond the tolerance of our bodies. But does anybody suspect that all reality becomes literal pain when too severe? When intensified still further, sensation ceases altogether with death. Yet are we to suppose that the stimulus of the fatal blow ceases, too, when it becomes unperceivable?

Of course, nobody makes these mistakes except phenomenalists. And when combined with other everyday experiences—such as our ability to extend our range of sight using microscopes and telescopes, or the ability of dogs to hear and smell things that humans cannot—it becomes very clear that our sensations, far from having any cosmic privilege, capture only a limited portion of reality and do not represent the truth literally.

What we have discovered about the world, since the scientific revolution, only confirms this notion. Our senses were shaped by evolution to allow us to navigate in a certain environment. Thus, we can see only a small portion of the electromagnetic spectrum—a portion that strongly penetrates our atmosphere. Likewise with every other sense: it is calibrated to the sorts of intensities and stimuli that would aid us in our struggle to survive on the surface of the earth.

There is nothing superstitious, therefore, or even remarkable, in believing that the building blocks of reality are invisible to human sensation. Molecules, atoms, protons, quarks—all of these are essential components of our best physical theories, and thus have as much warrant to be believed as the sun and stars. At the human scale, of course, there is a strong epistemological difference: these entities enter our knowledge as components of physical theories, and those theories help us to make sense of experience rather than constituting experience itself.

But that does not make them any less real. Indeed, our notion of an atom may be closer to nature than our visible image of an apple, since we know for certain that the actual apple is not, fundamentally, as it appears to human sight, while our idea of the atom may give a literally accurate view of nature. In fact, the view of sensations that I have put forward virtually demands that the truth of nature, whatever it is, be remote from human experience, since human experience is not a literal representation of reality.

This leads to some awkwardness. For if scientific truth is to be abstract—a theorem or an equation remote from daily reality—then what makes it any better than a religious belief? Isn’t what separates scientific knowledge from superstitious fancy the fact that the former is empirical while the latter is not?

But this difficulty is only apparent. Santayana aptly summarized the difference thus: “Mythical thinking has its roots in reality, but, like a plant, touches the ground only at one end. It stands unmoved and flowers wantonly into the air, transmuting into unexpected and richer forms the substances it sucks from the soil.” That is to say that, though religious ideas may take their building blocks from daily life, the final product—the religious dogma—is not fundamentally about daily life; it is more like a poem that inspires our imaginations and may influence our lives, but is not literally borne out in lived experience.

A scientific theory, on the other hand, is borne out in this way: “Science is a bridge touching experience at both ends, over which practical thought may travel from act to act, from perception to perception.” Though a physical theory, for example, is never itself perceived—we never “see” Einstein’s relativity as such—using it leads to perceivable predictions, such as the deviation of a planet’s orbit. This is the basis of experiment and the essence of science itself. Indeed, I think that this is an essential quality of all valid human knowledge, scientific or not: that it is borne out in experience.

Like quantum physics, superstitious notions and supernatural doctrines all concern things that are, in principle, unperceivable; but the difference is that, in quantum physics, the unperceivable elements predict perceivable events with rigorous precision. Superstitious notions, though in principle they have empirical results, are usually whimsical in their operation. The devil may appear or he may not, and the theory of demonic interference does not tell us when, how, or why—which gives it no explanatory value. Supernatural notions, such as those about God, angels, or heaven, are either reserved for another world, or their operation in this world is too vague to be confirmed or falsified.

So long as the theory touches experience at both ends, so to speak, it is valid. The theory itself is not and cannot be tangible. The fact that our most accurate knowledge involves belief in unperceivable things, in other words, does not make it either metaphysical or supernatural. As Santayana said, “if belief in the existence of hidden parts and movements in nature be metaphysics, then the kitchen-maid is a metaphysician whenever she peels a potato.”

Richard Feynman made almost the same point when he observed that our notion of “inside” is really just a way of making sense of a succession of perceptions. We never actually perceive the “inside” of an apple, for example, since by slicing it all we do is create a new surface. This surface may, for all we know, pop into existence in that moment. But by imagining that there is an “inside” to the apple, unperceived but equally real, we make sense of an otherwise confusing sequence of perceptions. Scientific theories—and all valid knowledge in general—do essentially the same thing: they organize our experience by positing an unperceived, and unperceivable, structure to reality.

Thus humanity’s attempt to understand nature is very accurately compared to an Englishman reading Latin with a London accent. Though we muddle the form of nature through our perception and our conception, by paying attention to the regularities of experience we may learn to understand nature quite well.

Quotes & Commentary #38: Emerson

The good writer seems to be writing about himself, but has his eye always on that thread of the universe which runs through himself, and all things.

—Ralph Waldo Emerson

While there have been many great writers who never wrote of themselves—Shakespeare comes to mind, of whom we know very little—it is certainly true that Emerson wrote reams about his cosmic self. His greatest book is his diary, an exploration of self that rivals Montaigne’s essays in depth and eloquence.

Writing about oneself, even modestly—and Emerson was not modest—inevitably involves self-mythologization. Emerson was the Homer of himself. He looked ever inward, and in his soul he found deities more alluring than Athena and battles more violent than the Trojan War. From this tumultuous inner life he created for himself a persona, a literary character, who both incorporated and transcended Emerson the man.

Everyone does this, to a certain extent. Identity is slippery, and the self is a vanishing figment of thought. As Hume pointed out, each of us is really just a floating observer embroiled in a bundle of sensations. Each moment we become a new person.

Our past only exists in our memory, which is just an internal rumor that we choose to believe. And our feeble sense of history, itself always in flux, is the only thing that ties together the confused mass of colors, sounds, and textures, the swirling indistinct thoughts, the shadowy images and daydreams, that make up our mental life.

Your identity, then, is more like the water flowing down a stream than anything solid. The self is a process.

This groundlessness, this ceaseless change, makes people uncomfortable. So much of our lives consists of building solid foundations for our insubstantial selves. Culture can be thought of as a response to this existential uncertainty; we constantly try to banish the ambiguity of identity by giving ourselves social roles, roles that tell us who we are in relation to everyone else, and who everyone else is in relation to us.

Each moment of the day carries its own ritual performance with its concomitant roles. In trains we become passengers, in cars we become commuters on our way to work, at work we become a job title, and at home we become a husband or wife.

The ritual of marriage, for example, is performed to impose an identity on you. But in order for this imposed identity to persist, the community must, in a million ways big and small, act out this new social role. Being married is a habit: a habit of acting, a habit of thinking about yourself, and a habitual way of being treated that friends, family, acquaintances, and even the federal government pick up.

A common way of reinforcing one’s identity is to attach it to something apparently solid, objective, and permanent. Thus people learn to equate their self-esteem with success, love, or money, or with their marriage or their job title. But these strategies can backfire. Marriages fail and jobs end, leaving people feeling lost. And if you identify your worth with your fame, skill, or the size of your wallet, you doom yourself to perpetual envy, since there will always be those above you.

People also position themselves demographically; they identify themselves with their age, nationality, ethnicity, race, or gender. These strategies have the merit of at least pointing to something substantial. I know, for example, that my behavior is influenced by the fact that I am an American; and by being cognizant of this identity, I understand certain behaviors of mine.

Nevertheless, this too can be taken too far, specifically when people reduce themselves to members of a group and attribute all their behaviors to the groups to which they belong. Your demographic identity influences your behavior, by shaping the pattern of your actions and thoughts, but it does not constitute your identity, since identity can never be pinned down.

Those with strong wills and forceful personalities, like Emerson, wrestle with this problem somewhat differently: they create a personal mythology. This is a process by which they select certain moments from their past and omit others, and by this selection create for themselves a story with a definite arc.

At the end of this arc is their persona, which is a kind of personal role, a character they invented themselves rather than adopted from society, formed by exaggerating certain qualities and downplaying others. This persona, unlike their actual, shifting identity, is stable and fixed; and by mentally identifying with this persona of theirs, they manage to push aside, for a time, the groundlessness of self.

I can’t help admiring these self-mythologizers, these artists of the literary self, the Emersons, Montaignes, and Nietzsches, who put themselves together through force of will. This procedure does carry with it some dangers, however, the most notable being the risk that you may outgrow or tire of your persona.

I can only speak from my own experience. Many times in my life I have acted out a sort of character in social situations, either from shyness about showing my real self or from an attempt to impress others; and although this strategy worked for a time, it ended by being highly unsatisfying.

In effect I trained those around me to respond to me in certain ways, to consider me in a certain light, and when I got tired of this character I was left with friends who didn’t know me. They knew a part of me, to be sure, since any character I can invent for myself will always have some of my qualities, but they didn’t know the full range of my traits.

Emerson was well aware of this danger, which is why he made it a point to be changeful and inconsistent. As he repeatedly said, he had no system. He considered himself an experimenter who played continually with new ideas. This itself was a sort of persona—the mercurial prophet, the spontaneous me—but it gave him the flexibility to expand and shift.

To me, there is nothing wrong with mythologizing yourself. The important thing is to recognize that your persona is not your self, and not to let a fixed conception of your own character constrain your actions. A personality is nothing but a pattern of behavior, and this pattern only exists in retrospect. You as you exist now are a bubble of awareness floating down a stream of sensation, a bubble that forms and reforms every passing moment.