Quotes & Commentary #69: Keynes

It is astonishing what foolish things one can temporarily believe if one thinks too long alone, particularly in economics (along with the other moral sciences), where it is often impossible to bring one’s ideas to a conclusive test either formal or experimental.

—John Maynard Keynes

I have been thinking a lot about Keynes lately, and not only because I am reading a massive biography of his life. Keynes is one of those perennial thinkers whom we can never seem to escape. He exerted enormous influence during his lifetime and dominated economic thought and policy for thirty years after his death. Then, as inevitably happens, the Keynesian orthodoxy became too successful for its own good. His ideas came to be taken for granted, and his innovations became the conventional wisdom that the cleverest economists of the next generation came to reject. This ushered in the age of Neoliberalism—with Margaret Thatcher, Ronald Reagan, and Milton Friedman as the great standard-bearers—and the decline of Keynesian thought.

And yet, whenever there is a serious problem with the economy, everyone instinctively returns to Keynes. It was he who most convincingly analyzed the sources of economic recession and depression, and then plotted a way out of it. He was writing, after all, in the wake of the Great Depression.

To oversimplify the basic idea of Keynes’s analysis, it is this: A lack of demand leads to high unemployment, high unemployment further depresses demand, and this vicious cycle can push financial systems beyond the breaking point. Put another way, the economy can be envisioned as an enormously complex machine that is composed of millions of cogs. Some cogs are small, some are large, and all are connected—either proximally or distantly. If one small cog stops working, then it may cause some local disturbances, but the whole machine can continue to chug along. But if too many cogs fail at the same time, the machine can come to a grinding halt.

As the coronavirus shuts down huge sections of the economy, this is exactly the scenario we are facing. Waiters, bartenders, actors, musicians, taxi drivers, factory workers—so many people face lay-offs and unemployment as businesses prepare to shut down. Besides this, if we are locked into our homes, then there are now far fewer places where people can spend their money, even if they have money to spend. It is inevitable that some people will not be able to afford rent, that some businesses will go under, and that much of the money that is available to circulate will remain unused in bank accounts. People are not going to be buying houses, or cars, or dogs, or much of anything in the coming weeks (besides toilet paper, of course).

Now, in a capitalist economy, anyone’s problem is also my problem, since spending and earning are so intimately related. The money you spend eventually becomes the money I receive, and vice versa. Thus, if there is an increase in unemployment (limiting the money you receive), an increase in bankruptcies (limiting the money the banks receive), and a decrease in spending (limiting the money I receive), then we have a recipe for serious economic contraction. A wave of bankruptcies inevitably puts pressure on banks; and if banks begin to collapse, then we are in grave trouble. Whether or not we like to admit it, banks provide an essential service in the economy, one which we all rely on. To return to my crude cog analogy, the banks are some of the biggest cogs of all; and if they stop turning, nothing else can move.

Keynes’s solution to this dilemma was essentially to use the government’s almost limitless ability to borrow money, and inject as much cash into the economy as possible. In other words, the idea is to stimulate demand, so that people can continue to spend money. It is an idea that has been criticized by so-called ‘responsible’ people for generations. Can the government really afford to go into so much debt during a recession? Can such artificial measures actually prop up an ailing economy? Can we tolerate such a huge degree of government involvement in a liberal society?
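The arithmetic behind stimulus spending is the so-called multiplier: each dollar the government injects is partly re-spent, and that re-spending is itself partly re-spent, and so on. Here is a minimal sketch of that idea in Python—the figures are purely hypothetical, chosen for illustration, and nothing here comes from Keynes’s own text:

```python
# A minimal sketch of the Keynesian spending multiplier.
# The marginal propensity to consume (mpc) is the fraction of each
# dollar received that gets spent again; the values below are hypothetical.

def total_spending(injection: float, mpc: float, rounds: int = 1000) -> float:
    """Sum the successive rounds of re-spending triggered by an injection."""
    total = 0.0
    spending = injection
    for _ in range(rounds):
        total += spending
        spending *= mpc  # only a fraction is re-spent in the next round
    return total

# With an mpc of 0.8, a 100-unit injection generates roughly
# 100 / (1 - 0.8) = 500 units of total spending across the economy.
stimulus = 100.0
mpc = 0.8
print(total_spending(stimulus, mpc), stimulus / (1 - mpc))
```

The simulated sum converges on the closed-form value `injection / (1 - mpc)`, which is why even a modest injection can, in principle, prop up a much larger volume of demand.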

Republicans—and to a lesser extent, even Democrats—have been sharply critical of Keynesian economics over the years. When Obama wanted a stimulus package for the 2008 financial crisis, he faced endless opposition and criticism from the Republican party. And now that we are facing an economic crisis on a comparable scale, the Republicans are turning without hesitation to Keynes: hundreds of billions in stimulus, and even resorting to mailing checks to every American. One could hardly imagine a more straightforwardly Keynesian solution than this. Keynes had this to say about how the government could deal with a recession:

If the Treasury were to fill old bottles with banknotes, bury them at suitable depths in disused coalmines which are then filled up to the surface with town rubbish, and leave it to private enterprise on well-tried principles of laissez-faire to dig the notes up again (the right to do so being obtained, of course, by tendering for leases of the note-bearing territory), there need be no more unemployment and, with the help of the repercussions the real income of the community, and its capital wealth also, would probably become a good deal greater than it actually is. It would, indeed, be more sensible to build houses and the like; but if there are political and practical difficulties in the way of this, the above would be better than nothing.

This is the closest that Keynes got to the notion of simply giving people money. Paying people for absolutely useless work is better than nothing, since at least then people are being paid; and if they are being paid, they can spend their money; and if they spend their money, I can get paid; and so on. If this were a different kind of crisis—a kind where we did not have to practice social distancing—then perhaps we could imagine large-scale infrastructure projects as a way of combating recession. But now, we must resort to the even more radical idea of paying Americans to do nothing. Maybe Andrew Yang’s notion of a universal basic income is not so far-fetched after all?

Well, here is where I must warn my readers (all three of you) that I am really quite clueless when it comes to economics, so everything written here must be read in that spirit of ignorance. However, I think that Keynes’s quote is also quite relevant for non-economic reasons. As is so often true in economics, we are facing an entirely novel situation. This is a crisis without precedent, and that means that all of our ideas of how to cope with the crisis are untested. The closest historical precedent to the coronavirus is the 1918 flu pandemic; and yet there are important differences in both the disease and the historical situation. We are thus operating without ‘conclusive tests,’ in Keynes’s words, of our ideas. It remains to be seen which country’s approach will be the wisest.

In the meantime, Keynes is an example for us to follow: an intellectual who responded to a historical crisis with both ingenuity and rigor. Let us hope there are many more like him.

Quotes & Commentary #65: Proust

In his younger days a man dreams of possessing the heart of the woman whom he loves; later, the feeling that he possesses the heart of a woman may be enough to make him fall in love with her.

—Proust

When it comes to love, artists can be usefully divided between romantics and cynics.

The former see love as something unambiguously wonderful, whose presence produces absolute joy and whose absence the most profound misery—something endlessly interesting to contemplate and unquestionably good. The cynical attitude—far less common—sees love as a kind of illusion, a self-hypnosis, which promises everything and delivers nothing. Proust was of the latter camp, whose solipsism admitted no possibility of genuine human connection.

I admit that I normally incline towards Proust’s view. I have never much liked love songs or love poetry, and most love stories leave me cold. Indeed, I wish that we did not dedicate so much of our art to romance. Human life is so rich in potential themes, and yet our art circles endlessly around the standard tropes of romantic love: the pain of rejection, the agony of desire, the triumph of success, the pangs of jealousy, and so on. The conventions are so well-established that it strikes me as artistically lazy to go through the same motions: first, because little originality is needed; second, because love, being an innate human desire, is something that people will respond to automatically, so little skill is needed to hold the audience’s attention.

I would even go farther, and assert that any art which relies so exclusively on these instinctual urges is a form of pornography. This is what I mean. However artfully pornography is directed and acted, it is still a lower form of art than non-pornographic films, since to be effective it only needs to appeal to a fundamental human desire. Food photography is arguably in the same class: even when well-done, its main appeal is to the stomach and not the mind. Extremely sentimental art, by appealing to the basic human desire for intimacy, falls into this category as well.

This is not to say that art should not involve emotions; that would be absurd. But true aesthetic appeal, for me, is always disinterested, involving the absence of desire. Thus any art which appeals directly to desires—and, like pornography, is really just the satisfaction of a desire in fantasy—falls short of real aesthetic appeal. Much love-themed art is clearly based on this fantasy satisfaction.

I do not wish to be dogmatic about this. Undeniably there are poems, plays, songs, and novels of the finest quality about love. My contention is only that these superlative works sublimate love into an aesthetic sensation, a pure appreciation of emotion devoid of pain or excitement. The ultimate example of this is Dante, who took his erotic passion and made it the primary element of his sweeping vision of the universe. But this, of course, is no easy thing. Rather I think it requires the greatest artistry to create excellent works devoted to love, since the artist must resist at every point the temptation to give in to fantasy.

If Dante is the ultimate artist of the positive potential of love—describing God as “the Love that moves the sun and the other stars”—then Proust is the ultimate artist of the cynical view. For him love was just another false prophet that distracts us from the truth of life and the tranquility of art. And nowadays it is hard to disagree with him.

We have found that love, far from a divine mystery, is the expression of an instinctual drive to procreate. Since stable pair-bonding is helpful for the survival of our children, it makes sense that we would evolve the tendency to fall in love. And now that romantic relationships are more fluid than ever before—with the rise of dating and divorce—we have clear and persistent evidence that even the strongest feelings of love do not necessarily, or even often, lead to permanent relationships.

Indeed, when you observe a person moving from partner to partner, equally in love with all of them in turn, equally convinced that each one is incredible (until he breaks up with them, at which point they become undesirable), then it is hard to resist the conclusion that love is a sort of self-hypnosis. For when we fall in love, we see only perfection in the beloved; and when we fall out of love, we see only ordinary flaws. The conclusion seems to be, as Proust says, that we love what we possess only because we possess it, and see the beloved as extraordinary simply because it is our beloved. This, of course, is an ironical situation, since the “most intimate” of connections appears, upon inspection, to be based on willful misapprehension. The loving eye sees least.

Given these reasons for cynicism, why is the romantic, rosy-eyed view of love so common in our culture? I would even go so far as to say that the cult of love has become a sort of religion. Finding the perfect partner is portrayed as the apex of happiness, the seal and guarantee of a good life.

Now, do not think I am some bitter enemy of love. Anybody who has ever been in love knows that it is one of the best feelings in life. Even so, I think it is unhealthy to dwell so insistently on romantic love, as if it could save us, complete us, perfect us. It is unhealthy, first, because happiness must always come from within us, and not from some external source—not even a relationship; and, second, because our inflated notions of love ironically lead us to expect too much from it, which damages relationships.

Though it is a cliché to say so, I think the truth about love lies between the romantic and the cynical view. Neither salvation nor illusion, neither effortless nor impossible, neither invincible nor insubstantial, neither the point of life nor a pointless waste—love is a beautiful but ordinary thing. And art, insofar as it strives to represent reality, ought to try to show love in all its ordinariness.

Quotes & Commentary #64: Goethe

And when your rapture in this feeling is complete,
Call it what you will,
Call it bliss! heart! love! God!
I do not have a name
For this. Feeling is all;
Names are but sound and smoke
Befogging heaven’s blazes.

—Goethe

We humans give the name “love” to so many different things that it can be difficult to tell what it means. No word is more overused. Turn on the radio and we hear love songs; switch on the television and every show, comedy or drama, has a love story; open a novel, and chances are the same is true. We love everything from our children to Cheetos, from our friends to our phones. Some people love God and some love Lady Gaga. How can any word accommodate so many different relationships?

Part of the ambiguity comes from our using the word “love” to express three distinct things: feelings, preferences, and values. By “feeling” I mean some emotion felt in the present moment, in this case an emotion of intense pleasure. This is what somebody means when they take a bite of a hamburger and say, “I love this!” We are also expressing a feeling when, with a loved one, we spontaneously say “I love you!” In this sense, the word is purely emotive, comparable to smiling or laughing.

All feelings are, by definition, fleeting and temporary; but we use “love” in calmer moments to express more durable preferences. By “preference” I mean a tendency to enjoy and choose something; this is what somebody means when they say “I love the Beatles.” In less serious moments, we also use the word “love” this way with people, such as when we say “I love my coworkers.” By saying this, the speaker is clearly not expressing any level of commitment to her coworkers; she is only expressing her tendency to enjoy and appreciate their company.

The strongest and, you might say, the most proper use of the word “love” is to express a value. We “value” something when we are willing to act for its sake, enduring inconvenience, pain, or even death in its service. When we value something we identify ourselves with it, making it an extension of ourselves. This is the sort of bond that exists between close friends, family, and romantic partners. And I think it is important to understand love this way, since it explains how it is possible to simultaneously love somebody and be furious at them—which would be contradictory if love were simply a feeling.

Clearly, any good relationship will consist of a combination of these three layers. We feel good in the presence of a loved one, we prefer seeing them, and we value them deeply. Yet it is clearly possible to have one without the others. Specifically, I think a confusion between the emotive and the value aspects of love is what causes people to agonize over the question, “Do I really love X?” This is because it is clearly possible to value somebody deeply but to feel angry and hurt in their presence; and conversely it is possible to feel very happy in somebody’s company without being committed to them.

Part of this confusion is unavoidable. This is because it can be difficult to tell how much we really value something. Value is not something we feel and thus is not obvious. Rather, our values are revealed by our actions over a stretch of time. How much time and energy do we devote to somebody? How far are we willing to interrupt our lives for their sake? How consistent is our willingness? We cannot, in other words, simply introspect and feel value. And even when we see that we consistently value something, it is impossible to predict with certainty how long it will last. Thus love, like the rest of life, always requires a leap of faith.

Quotes & Commentary #63: Voltaire

The superstitious are the same in society as cowards in an army; they themselves are seized with a panic fear, and communicate it to others.

—Voltaire

When I was a child I was afraid of ghosts. Coincidentally, at both my mother’s and my father’s house, I had a next-door neighbor who very much encouraged the fear. Both were girls, both a couple years older than me, and both told me ghost stories that filled me with wonder and scared me half to death. Once, I remember being so frightened of ghosts in the attic that I begged my mother, with tears in my eyes, not to go up, sure that she would meet some horrible end. (She was miraculously unharmed.) I even went on ghost discovery missions with my neighbor and my brother, in the forest behind my house; we didn’t find anything, but once we took a Polaroid in which the sun’s rays, coming through the trees, created an odd aura that looked vaguely ghostlike.

Naturally my superstitious beliefs weakened with age until they left me altogether. Admittedly, living in a very secular part of the country helped. Since those ghost hunting days, I have not personally come into contact with a lot of superstitious behavior. But whenever I have, I am filled with a strange mixture of pity and revulsion, for superstition strikes me as the lowest depth to which the adult human mind may fall. Traditional superstitions are the child’s fear of the dark, of the strange creaks at night, of the unexplained coincidence—in short, fear of the unknown—hardened into a belief handed down the generations. They are socially condoned phobias.

While I am no friend of religion, I can at least sympathize with the comfort provided by a faith in a just and caring God. I can see how a belief in a higher power might ennoble a person and lift them up above circumstances. But superstition, as I understand it, does just the opposite: it shrinks the universe down to petty dimensions, and fills the superstitious with debilitating and needless fears. For to believe that throwing salt over your shoulder, walking over a grave or under a ladder, opening an umbrella indoors or saying some forbidden word, being passed by a black cat or doing something at a certain hour or on a specific day—to believe that these trivial events can significantly influence your life is to give monumental importance to one’s smallest actions, and is thus really a form of egotism.

And how does the belief in ghosts, spirits, demons, devils, monsters, or even “luck” itself, add to your experience of the world? All these are boogeymen who cause us to revert to a state of childlike terror. And what are the consequences of these beliefs? If you believe that certain very normal things are cursed, haunted, or even “bad luck,” you will go through life needlessly avoiding things. Indeed, I admit that it strikes me as an affront to human reason for a person in this century to become nervous because they have spilled salt.

But the most nefarious part of superstitions is not that they are illogical, but that they are socially condoned, often through association with religion. Thus people are not encouraged to test these fears to see if they are justified, but exactly the opposite: they are encouraged to obey the fears and never to criticize them. This is what I mean by calling them socially condoned phobias. For an irrational fear in one person is a phobia, to be treated by a psychologist; but in a whole society it is a superstition, to be respected.

In general I think that fear should be combated wherever it isn’t absolutely necessary, for fear limits our options, distorts our views, and shrinks our world. And superstition, being a socially contagious form of irrational fear, is perhaps the worst example of this. Yet having written this diatribe, I must here admit that I enjoy picking up pennies when I find them on the ground. I do not believe they give me good luck, but somehow it feels like winning a prize. What strange stuff we are made of!

Quotes & Commentary #62: Santayana

Matters of religion should never be matters of controversy. We neither argue with a lover about his taste, nor condemn him, if we are just, for knowing so human a passion.

—George Santayana

This quote sums up the apparent futility of argument—not only about religion, but about so many things that arouse strong feeling. I have never seen, or even heard of, a discussion about religion or politics that ended with one of the participants being convinced. If anything, conversations about these topics seem only to entrench the opposing parties in their positions.

This occurrence appears common and universal; and yet its implications strike at one of the pillars of western thought—that rational arguments can be used to reach the truth and to convince others—as well as of liberal democracy, which rests on the ideal that, to paraphrase John Milton, truth emerges victorious from open encounters with untruth. If debate is really futile in matters of religion (which involve our ultimate views of life and the universe) and politics (which involve our stance toward society), then are we doomed to endless tribal bickering based on nothing more than group mentality?

I strongly wish that this wasn’t the case; but I admit that, judging by my actions in daily life, I have little faith in the power of reason in these matters. I tend to avoid topics like religion and politics, even among friends. Powerful emotions underpin these aspects of life; values and identity are implicated; and individual psychology—background, traumas, inadequacies—may place these matters far from the realm of cold calculation.

To a large extent, admittedly, rationality has only a subsidiary role in decision-making. Hume was quite right, I believe, to call reason a “slave of the passions.” We are never motivated by reason alone; indeed I don’t even know what that would look like. We are motivated, instead, by desires, which are organic facts. In themselves, desires are neither rational nor irrational. Rationality only applies, first, when we are figuring out how to satisfy these desires; and, second, when multiple, conflicting desires are at play.

The desires to be skinny and to eat three pints of ice cream a day, for example, conflict with one another, and reasoning is needed to achieve a harmony between these two. A reasoner may realize that, however delicious ice cream may be, the desire to be skinny is consonant with the stronger desire to be healthy and live long, and so the ice cream is cut back. Both internally, within our psyches, and externally, within society, reason is how we achieve the most satisfying balance of competing desires.

Since reason rests on a fundamentally non-rational basis—namely, desires—it may be the case that reason sometimes has no purchase. In politics, for example, somebody may crave equality, and another person freedom; and no argument could move or undermine these desires, since neither is rational in the first place. Different political orientations are rooted in different value systems; and values are nothing but orientations of desires.

But I think it is often the case that competing value systems have many points in common. Grave inequality can, for instance, curtail freedom; and enforced equality can do the same. For either party, then, a satisfactory society cannot have absolute inequality, absolute equality, absolute freedom, or absolute slavery. These different values are therefore not totally at odds, but are merely different emphases of the same basic desires, different ways to harmonize competing pulls. And in cases like these, rational argument can help to achieve a compromise.

What about religion? Here the case seems somewhat different from politics, since religion is not just a question of values but involves a view of reality.

Admittedly, political ideologies also involve a certain view of reality. Each ideology comes with its own historical narrative. Sometimes these narratives are nothing but a tissue of lies, as with the Nazis; and even the most respectable political narrative may make some dubious assumptions. Nevertheless, the validity of political opinions is not purely a matter of the truth of their historical narrative. Somebody may genuinely desire communism even if everything they assert about the Soviet Union is wrong; and if debunking their history makes us doubtful of the possibility of satisfying their desire, it does not invalidate the desire itself.

With religion, to repeat, the case is somewhat different, since religions assert some set of facts about the universe; and without this set of facts, the religion falls to pieces. All of Marx’s theories of history may be wrong, but you can still rationally want a communist society. But a Christianity without a belief in a divine Jesus has lost its core. It is no longer a religion. In this way religion is decidedly not like falling in love, contrary to Santayana, since love, being pure desire, makes no assertion about the world.

This seems to put religions on a different footing, since they rest not only on desires, but beliefs. And if these beliefs prove incorrect or irrational, then the religion ceases to make sense. From my readings in history, science, philosophy, and theology, it seems quite clear to me that this is the case: that insofar as religious notions can be disproved, they have been; and insofar as they are unprovable, they are irrational to believe.

Indeed, I think with enough time I could explain this quite clearly to a believer. But I have never tried, since I am almost positive it wouldn’t work—that their religious beliefs would be impervious to argument. I also admit that the thought of doing so, of trying to talk someone out of a religion, makes me feel uneasy. It seems impolite and invasive to try to exert so much pressure on somebody’s fundamental beliefs. And even if I were successful, I believe I would feel somewhat guilty, like I had just told a child that Santa wasn’t real.

But is this uneasiness justified? If religions are truly irrational, based on a mistaken picture of the world, then they can give rise to unjustifiable actions. The religiously inspired fights against gay marriage and abortion, and the denial of climate change, are excellent examples of this. Furthermore, if people habitually accept an irrational picture of the world, basing beliefs on religious authority rather than reasoned arguments, then perhaps they will be more easily manipulated by unscrupulous leaders.

On the other hand, living in a liberal society requires tolerance of others’ beliefs, rational or otherwise. And living in a polite society requires that we show respect even where we do not agree. So it seems that a balance must be struck between arguing against an irrational belief and keeping considerate silence.

Quotes & Commentary #61: Santayana

To abolish aristocracy, in the sense of social privilege and sanctified authority, would be to cut off the source from which all culture has hitherto flowed.

—George Santayana

Though I do not share Santayana’s sanguine attitude towards the aristocracy, I think this quote does bring up a vital point: the relationship between art and its patrons.

Nowadays we take it for granted that artists make their money the way that anyone else does, by trading their services on the open market. The buying public—concert-goers, music purchasers, companies that need songs for commercials, and so on—is the ultimate art patron. But this has not historically been the case. Wealthy institutions and affluent individuals have more commonly played this role. So what does this shift from artistic feudalism to capitalism signify?

This question is far more than merely financial. For the artist, however proud and independent, cannot help but be influenced by their audience and supporters.

It is easy to deprecate the vulgarity of popular art in the age of capitalism, but I am not sure aristocracy was much better. Goya’s most profound works are not his portraits of his aristocratic confreres, however excellent these may be; and the same goes—with some extremely notable exceptions—for Velazquez’s many portraits of the royal family. There is no logical reason why an aristocracy of power and wealth should also be an aristocracy of taste.

True, hereditary aristocrats, freed from laborious duty, do have more free time to devote to artistic appreciation. Without the necessity to make their way in the world, they may decide to compete in aesthetic refinement or in sponsoring living artists. This is possible, to be sure, and has happened many times in history. But this method of patronage—private, wealthy individuals—can easily lead to self-aggrandizement; the art it fosters, by being too allied to worldly wealth and earthly power, becomes yet another form of conspicuous consumption.

Perhaps the greatest art patron in western history has been, not royals or nobles, but the church. In music and the visual arts, at least, religious patronage has led to some of the greatest accomplishments in our history: the works of Palestrina, Bach, El Greco, Michelangelo. The advantages of church patronage are clear. An institution of enormous wealth, it can recruit the best artists and supply all the resources they need. More than that, though the church is also prone to self-aggrandizement, the spiritual aims of religious art free it from the worldliness of the hereditary aristocracy.

Even more important, perhaps, is the continuity of tradition fostered by religions: establishing subjects, tropes, styles, and techniques, that are refined and passed down through the ages. How could Bach have written his Mass in B Minor, or Michelangelo conceived the Sistine Chapel, if they had not been the beneficiaries of hundreds of years of religious tradition?

Granting the church its honorable place in the history of art, we may, however, still admit that religious patronage can lead to a sterile conformity. There are only so many ways, and only so many emotions, that can be portrayed in a Madonna and Child; and, in any case, the church will not prove congenial to nonreligious artists—of which history is full. The Dantes of this world may find in the church all they need; but a Rabelais can never be so satisfied. Inevitably a religious organization will overlook or squelch some aspects of the human experience.

This became clear when a new patron emerged in history, far removed from the grandeur of nobility or the magnificence of the church: namely, the mercantile middle-class. The prime example of this is the painting of the Dutch Golden Age. All patrons, to an extent, like to see themselves represented in what they patronize; thus the newly powerful Dutch capitalists gravitated towards private portraits and intimate scenes of daily life.

The artistic advantages of this shift in patronage are obvious, opening up new vistas for artists to explore. Neither an aristocrat nor a clergyman, for example, would think of buying a work like Vermeer’s The Milkmaid—a work that has nothing to do with aristocratic virtues or spiritual consolations. Of course there is a downside to this, since the qualities that help merchants succeed have nothing to do with artistic appreciation; and, even if guided by exquisite taste, bourgeois art pays for its wider scope with limited depth. It is an art of prose, not poetry, with a weaker tradition to guide it and quotidian values to embody.

The ideal situation may be a mixture of patronage, such as was the case with Shakespeare. Having every rung of society as his audience, from paupers to princes, he had to strive for universality—and obviously succeeded. But the bard was clearly an exceptional case. Striving to please everyone can easily turn into pleasing nobody in particular, creating something bland and unobjectionable. While the particular taste of patrons may be constraining for some artists, it may help many others to focus.

In recent years the university has become a major source of patronage, especially for musical composition. The advantage of this is clear in an age when the general public has little to no interest in art music. But the nature of the university, as an institution, can also have negative artistic repercussions. Unlike a church, guided by spiritual values, a university is above all a place of exploration; and thus academic music tends to be experimental. Experimentation is usually an artistic virtue; but when cut off from any common set of aesthetic ideals, art degenerates into intellectual exercise.

The economy of the visual arts has diverged quite radically from either literature or music. In the age of mechanical and digital reproduction, the physical uniqueness of a painting has made it an ideal collector’s item. Thus paintings are once more a form of conspicuous consumption, with wealthy patrons spending millions on single works. And unlike in former times, when the aristocrats were the only ones able to afford art, the easy access to reams of high-quality art puts pressure on them to distinguish themselves through extreme taste as well as extreme expenditure. Given that the prize can be so huge, this is an irresistible incentive to inaccessibility—the competition to appreciate the unappreciatable.

The book and music industries are, by contrast, dominated by a relatively small number of giant publishing houses and record companies. Being companies, their fundamental motivation is profit. This makes them naturally risk-averse, since it is always safer to reproduce success than to bet on something different; and this encourages a conformity to commercially successful styles and topics. Acting as gatekeepers to fame, these companies can therefore exert a standardizing influence. And for obvious reasons these companies favor simple, popular styles in order to maximize their clientele.

But does an artist even need a patron? What about the Dickinsons, the van Goghs, and the Kafkas of the world, toiling away, unknown and unsuccessful, in some remote corner? It is true that many great artists never managed to make a living off their art during their lifetimes, relying on extraneous work or their families for support. And this arrangement does have the key advantage of allowing the artist to pursue her individual vision, without having to adapt her work to any foreign tastes, preserving her originality whole and entire.

Yet even this blessing is not unmixed. For patronage, if it subjects artists to sometimes undesirable pressure, can also give artists the direction and external challenge they need. Not every artist is self-sufficient enough to work in silent obscurity, following the bent of their own genius. The structure imposed by patronage can turn a vague or self-involved aesthetic impulse into a focused piece.

It may seem sordid to think of art in these monetary terms; but, as I hope I have shown, this is not a purely aesthetic question. Part of what gives any age its characteristic art is the way that artists make their living. The internet is now opening new possibilities for artistic entrepreneurs. The ultimate aesthetic effects of this new medium are only just beginning to appear.

Quotes & Commentary #60: Santayana


We read nature as the English used to read Latin, pronouncing it like English, but understanding it very well.

—George Santayana

This simile about the relation between human knowledge and material fact expresses a deep truth: to understand nature we must, so to speak, translate it into human terms.

All knowledge of the world must begin with sensations. All empirical knowledge derives, ultimately, from events we perceive with our five senses. But I think it is a mistake to confuse, as the phenomenalists do, these sensations with reality itself. To the contrary, I think that human experience is of a fundamentally different sort from material reality.

The relationship between my moving finger and the movement of the string I pluck is direct: cause-and-effect. The relationship that holds between the vibrations in air caused by the guitar string, and the sound we perceive of the guitar, is, however, not so direct. For conscious sensations are not physical events. You cannot, even in principle, describe the subjective sensation of guitar music using physical terms, like acceleration, mass, charge, etc.

The brain represents the physical stimulus it receives, transforming it into a sensation, much like a composer represents human emotions using notes, harmonies, and rhythms—that is, arbitrarily. There is no essential relationship between sadness and a minor melody; they are only associated through culture and habit. Likewise, the conscious perception of guitar strings is only associated with the vibrations in the air through consistent representation: every time the brain hears a guitar, it creates the same subjective sensation. But the fact remains that the vibrations and the sensation, if they could be compared, would have nothing in common, just as sadness and minor melodies have nothing in common.

I must pause here to note a partial exception. In his Essay Concerning Human Understanding, John Locke notoriously makes the distinction between primary and secondary qualities. The latter are things like color, taste, smell, and sound, which are wholly subjective; the former are things like size, position, number, and shape: qualities that are inherent in the object and independent of the perceiving mind. Berkeley criticized this distinction; he thought that all reality was sensation, and thus there was no basis for distinguishing primary and secondary—both only exist in human experience. Kant, on the other hand, thought that reality in-itself could not, in principle, be described using any terms from human experience; and thus primary and secondary qualities were both wholly subjective.

Yet I persist in thinking that Locke was rather close to the truth. But the point must be qualified. As Einstein showed, our intuitive notions of speed, position, time, and size are only approximately correct at the human scale, and break down in situations of extreme speed or gravity. And we have had the same experience with regard to quantum physics, discovering that even our notions of location and number can be wholly inaccurate on the smallest of scales. Besides these physical considerations, any anthropologist will be full of anecdotes of cultures that conceive of space and time differently; and psychologists will note that our perception of position and shape differs markedly from that of a rat or a bat, for example.

All this being granted, I think that Locke was right in distinguishing primary from secondary qualities. Indeed, this is simply the difference between quantifiable and unquantifiable qualities. By this I mean that a person could give an abstract representation of the various sizes and locations of objects in a room; but no such abstract representation could be given of a scent. The very fact that our notions of these primary qualities could be proven wrong by physicists shows that they are categorically distinct. A person may occasionally make a mistake in identifying a color or a scent, but all of humanity could never be wrong in that way. Scientists cannot, in other words, show us what red “really looks like,” in the same way that scientists can and have shown us how space really behaves.

Nevertheless, we have discovered, through rigorous experiment and hypothesis, that even these apparently “primary qualities”—supposedly independent of the perceiving mind—are really crude notions that are only approximately correct on the scale of human life. This is no surprise. We evolved these capacities of perception to navigate the world, not to imagine black holes or understand electrons. Thus even our most accurate perceptions of the world are only quasi-correct; and there is no reason why another being, adapted to different circumstances, might represent and understand the same facts quite differently.

It seems clear from this description that our sensations have only an indicative truth, not a literal one. We can rely on our sensations to navigate the world, but that does not mean they show us the direct truth. The senses are poets, as Santayana said, and show us reality guised in allegory. We humans must use our senses, since that is all we have, but in the grand scheme of reality what can be seen, heard, or touched may be only a minuscule portion of what really exists—and, as scientists have discovered, that is actually the case.

To put these discoveries to one side for a moment, there are other compelling reasons to suspect that sensations are not open windows to reality. One obvious reason is that any sensation, if too intense, becomes simply pain. Pressure, light, sound, or heat, while all separate feelings at normal intensities, all become pain when intensified beyond the tolerance of our bodies. But does anybody suspect that all reality becomes literal pain when too severe? When intensified still further, sensation ceases altogether with death. Yet are we to suppose that the stimulus of the fatal blow ceases, too, when it becomes unperceivable?

Of course, nobody makes these mistakes except phenomenalists. And when combined with other everyday experiences—such as our ability to increase our range of sight using microscopes and telescopes, or the ability of dogs to hear and smell things that humans cannot—then it becomes very clear that our sensations, far from having any cosmic privilege, represent only a limited portion of reality, and do not represent the truth literally.

What we have discovered about the world, since the scientific revolution, only confirms this notion. Our senses were shaped by evolution to allow us to navigate in a certain environment. Thus, we can see only a small portion of the electromagnetic spectrum—a portion that strongly penetrates our atmosphere. Likewise with every other sense: it is calibrated to the sorts of intensities and stimuli that would aid us in our struggle to survive on the surface of the earth.

There is nothing superstitious, therefore, or even remarkable, in believing that the building blocks of reality are invisible to human sensation. Molecules, atoms, protons, quarks—all of these are essential components of our best physical theories, and thus have as much warrant to be believed as the sun and stars. From a human scale, of course, there is a strong epistemological difference: they form components of physical theories; and these theories help us to make sense of experience, rather than constitute experience itself.

But that does not make them any less real. Indeed, our notion of an atom may be closer to nature than our visible image of an apple, since we know for sure that the actual apple is not, fundamentally, as it appears to human sight, while our idea of atoms may indeed give a literally accurate view of nature. In fact, the view of sensations that I have put forward virtually demands that the truth of nature, whatever it is, be remote from human experience, since human experience is not a literal representation of reality.

This leads to some awkwardness. For if scientific truth is to be abstract—a theorem or an equation remote from daily reality—then what makes it any better than a religious belief? Isn’t what separates scientific knowledge from superstitious fancy the fact that the first is empirical while the latter is not?

But this difficulty is only apparent. Santayana aptly summarized the difference thus: “Mythical thinking has its roots in reality, but, like a plant, touches the ground only at one end. It stands unmoved and flowers wantonly into the air, transmuting into unexpected and richer forms the substances it sucks from the soil.” That is to say that, though religious ideas may take their building blocks from daily life, the final product—the religious dogma—is not fundamentally about daily life; it is a more like a poem that inspires our imaginations and may influence our lives, but is not literally borne out in lived experience.

A scientific theory, on the other hand, is borne out in this way: “Science is a bridge touching experience at both ends, over which practical thought may travel from act to act, from perception to perception.” Though a physical theory, for example, is something that is never itself perceived—we never “see” Einstein’s relativity in itself—using it leads to perceivable predictions, such as the deviation of a planet’s orbit. This is the basis of experiment and the essence of science itself. Indeed, I think that this is an essential quality of all valid human knowledge, scientific or not: that it is borne out in experience.

Like quantum physics, superstitious notions and supernatural doctrines all concern things that are, in principle, unperceivable; but the difference is that, in quantum physics, the unperceivable elements predict perceivable events with rigid certainty. Superstitious notions, though in principle they have empirical results, are usually whimsical in their operation. The devil may appear or he may not, and the theory of demonic interference does not tell us when, how, or why—which gives it no explanatory value. Supernatural notions, such as about God or angels or heaven, are either reserved for another world, or their operation on this world is too vague to be confirmed or falsified.

So long as the theory touches experience at both ends, so to speak, it is valid. The theory itself is not and cannot be tangible. The fact that our most accurate knowledge involves belief in unperceivable things, in other words, does not make it either metaphysical or supernatural. As Santayana said, “if belief in the existence of hidden parts and movements in nature be metaphysics, then the kitchen-maid is a metaphysician whenever she peels a potato.”

Richard Feynman made almost the same point when he observed that our notion of “inside” is really just a way of making sense of a succession of perceptions. We never actually perceive the “inside” of an apple, for example, since by slicing it all we do is create a new surface. This surface may, for all we know, pop into existence in that moment. But by imagining that there is an “inside” to the apple, unperceived but equally real, we make sense of an otherwise confusing sequence of perceptions. Scientific theories—and all valid knowledge in general—do essentially the same thing: they organize our experience by positing an unperceived, and unperceivable, structure to reality.

Thus humanity’s attempt to understand nature is very accurately compared to an Englishman reading Latin with a London accent. Though we muddle the form of nature through our perception and our conception, by paying attention to the regularities of experience we may learn to understand nature quite well.

Quotes & Commentary #57: Santayana


History is nothing but assisted and recorded memory. … Memory itself is an internal rumour…

—George Santayana

I tend to be distrustful of my own faculties.

As I mentioned in my last post, the reliability of my own senses (or lack thereof) is something that has long troubled me. It is not that my senses are themselves deficient; though I wear glasses my eyesight is not bad, and my hearing not much worse. Partially, it is my lack of attention. Most days I am in my own world and overlook what is right in front of my eyes. What use is good eyesight if the eyes are vacant?

Beyond this tendency to drift off mentally, I lack faith in our ability to sense the truth. Vast regions of the electromagnetic spectrum are hidden from human sight. A dog’s nose or a cat’s ears put our primate parts to shame. And even the finest organs of sensation are inadequate to probe the finest reaches of matter, the basic building blocks of reality.

I also distrust my memory. This is not without reason. Experience after experience testifies that my memory is not to be blindly trusted.

When I was younger I had the nasty habit of stealing one of my friend’s jokes. He was much funnier than me, and the temptation to repeat what he said proved irresistible. The problem was that I would do it in his presence, and without crediting him. This may sound ungrateful; but the truth is that, in the moment, while telling the joke, I would forget that he had said it. My memory had wishfully appropriated the joke, and somehow I had convinced myself that it was I who had said it first. His complaints would destroy the illusion, and momentarily give me a sickening sense of self-doubt. Memory had played a trick on me.

Yet memory, for all its failings, is normally remarkably dependable. We remember our address, our friends’ names, our schedule, how to tie shoes and to spell words, the plots of our favorite novels and the lyrics of our favorite songs, and a thousand other things. Life as we know it would be simply impossible if we did not. Most of this remembering, it is true, is done unconsciously, in the form of habit. Indeed, the automatic responses of habit, being born of repetition, are far more dependable than our attempts to remember isolated events.

This is where Santayana’s remark, that memory is nothing but an “internal rumour,” rings most true: in trying to recall something that only happened once. In these cases, without a routine or a habit to tie it to, the lone memory must stand on its own power, with nothing to corroborate it. How can we be sure we are not mistaken? In my experience, we often are. If you have ever heard somebody narrate an event at which you were present, you may have had the same experience: the selfsame event can be completely unrecognizable in someone else’s telling.

I find these situations particularly distressing, since there is usually no way to decide which version of the event is correct, your own or the other person’s. The reason for this unreliability is not difficult to discern, I think. Memory lies on the same spectrum as imaginative fancy; and what we remember, and how we remember it, is shaped by our own constitution: what we notice in the moment, how we interpret what we notice, what we find amusing or interesting, all of this colored by our own vanity, our hobby horses, our desires, and our insecurities.

Memories are not just filed away and pulled out. Subsequent experience affects them. To pick just one example, the urge to dramatize a story—exaggerating some details, omitting others, shifting around the order of events—in the telling and retelling is, for me, impossible to resist. Yet strangely, I have found that the act of dramatizing a story actually changes my memory of it, until the warped version is all that is left.

One major problem is that good stories are memorable in a way that unadorned reality is not. Life simply flows on, a long chain of events with no beginning, middle, or end, and in itself reality has no meaning or moral. Memory selects from this tissue of events and weaves them together with its own logic. You might call this “narrative logic,” the logic of stories: with a defined setting, a discrete cast of characters, a climax and a resolution: events with human meaning and emotional resonance.

I think that narrative logic is highly artificial, a perceived order imposed on events by the wishful mind, and that its substance is pure feeling. It is true, as I have written before, that this tendency can be unhealthy. But we humans, as emotional creatures, need these narratives: they give us a sense of purpose and direction. Nearly everyone, to some extent, has their own personal mythology, a group of stories about themselves that explain who they are and where they’re going. 

Narrative logic also governs politics. As Jonathan Haidt observed, every political ideology has its own narrative. The American left has a narrative based on overcoming bigotry, and the American right a narrative about oppressive government; the Nazis, the Soviets, the Spanish anarchists all had their own narratives. These narratives normally have a degree of historical validity, being based on some real events and actual facts. But the interpretation given to the events and facts is inevitably dubious; and, even more importantly, the facts and events left out, silently omitted, are what give the narrative its sense of direction and purpose. Historical forgetting is crucial to political grouping.

Santayana is, of course, correct in a general sense that history is nothing but assisted and recorded memory. Without memory, we could do nothing characteristically human—neither read nor write, much less write history. But what this quote leaves out is that memory, once catalogued and recorded in cold records rather than a living brain, changes its character.

Technology helps humans overcome our biological limitations. Just as the microscope and the microphone make our normally untrustworthy senses more dependable, so does the act of writing compensate for the failings of our memory. What is written down is no longer subject to the passions, fancies, and revisions of our imagination. Thus, by basing our records on documentation—whether it be cuneiform tablets or government records, or archeological remains—we may break the spell of narrative logic.

Of course, we may choose what records to read, and then choose what to write in our history books, and we may choose to selectively read those history books, and so on. Humans are remarkably resilient to facts that do not accord with their worldviews. Indeed, as we have seen with the proliferation of fake news and the information bubbles that partition the country ideologically, this problem is just as acute as ever. As Santayana went on to say:

History is always written wrong, and so always needs to be rewritten. The conditions of expression and even of memory dragoon the facts and put a false front on diffuse experience. What is interesting is brought forward as if it had been central, and harmonies are turned into causes. … Such falsification is inevitable, and an honest historian is guilty of it only against his will.

All of this notwithstanding, it is also true, I think, that the discipline of history makes progress in breaking the spell of these political narratives—at least enough to make them more honest, more just to the facts. This is how it should be. To fabricate the historical record, as the Nazis and the Soviets did, opens the door to extremism. But we could not live, either individually or socially, without any narrative to guide us. As long as humans look for meaning in life or group together behind a cause, they will tell stories about themselves. And these stories will inevitably rest upon the internal rumor of memory.

Both philosophy and history are, among other things, forms of criticism; and as such their function is to “surprise the soul in the arms of convention” (to use another of Santayana’s felicitous phrases). This criticism and the skepticism it entails are healthy and desirable. But we cannot remain thus untethered. The point of criticism is not to achieve pure skepticism but to give us enough mental and emotional distance to be able to consciously choose our ideals and stories, and not have them chosen for us by convention.  

Quotes & Commentary #53: Goethe


Not all that’s foreign can be banned

For what is far is often fine.

A Frenchman is a thing no German man can stand,

And yet we like to drink their wine.

—Goethe, Faust

So long as humans have divided themselves into groups, xenophobia has existed. Like many phobias—such as of spiders, snakes, and heights—fear of foreigners has an evolutionary logic. In a time before laws, city walls, and police, when small migratory bands of hunter-gatherers roamed the world, strangers were an acute threat. Violence within one’s own group could be reduced through interdependence and social pressure; but there was comparatively little to deter violence between groups. As such, it made sense to be fearful of strangers, just as it made sense to fear poisonous critters and deadly falls.

But the human environment changes faster than the human mind can evolve. Our fears are often maladjusted to the modern world. We panic when we see rats, bats, cockroaches, and we feel queasy on tall buildings. Yet how many people have phobias of cars or guns, two far more deadly facets of the modern world? Not many, and that’s the point: our brains are attuned to different threats than now exist. The same logic applies to foreigners. The old fear of strangers, once useful and life-preserving, has in our day of nation-states transformed into a useless fear of foreigners. And as everybody in the United States knows, this fear has recently experienced a resurgence.

Xenophobia is nothing new in America. We were never so accepting of immigrants as our national mythology would have us believe. There have been periods of backlash against many different ethnic groups: Germans, Irish, Chinese, and now immigrants from Latin America and from predominantly Muslim countries. That this xenophobia is based on provably irrational fears—rampant crime, “job stealing,” or terrorism—hardly affects the deep-rooted emotional response to foreigners. And whipping up sentiment against outsiders, after all, is the easiest thing in the world, since outsiders have no social bonds to the community.

Yet however deeply rooted the fear is in our psychology, it is not ineradicable. A fear of insects is another of our predisposed phobias, since poisonous insects were daily perils for our ancestors. But when I was in Kenya, constantly exposed to legions of flying, crawling, stinging, biting bugs, I soon lost my fear and felt perfectly at home. I ceased to be afraid once I realized that my fear was irrational: the bugs were safe, so long as I didn’t do anything stupid. Similarly, living in an international city like New York reduces xenophobia through daily contact. An irrational fear quickly dissipates when prolonged experience exposes the fear’s lack of basis in reality.

I do not mean to be overly simplistic. Obviously other factors than our primitive wiring affect xenophobia. In the case of Germany and France, for example, those two states competed for resources and power, leading them into conflict and stirring up hatred. And this hatred, combined with political and language barriers, was—despite living in close proximity—sufficient to motivate the populations of those two countries to kill one another in huge numbers, just for the sake of identity. Obviously, proximity by itself is not enough to overcome xenophobic hatred. Both groups must see each other, not as competitors, but as collaborators, with something positive to contribute to one another.

As Goethe points out, I think that cuisine has played a surprisingly central role in promoting inter-group harmony. It is said that music is an international language, but I think food and alcohol better deserve that title. Ingredients, dishes, delicacies, gourmet products, and culinary techniques have traveled far and wide. When it comes to fear of foreigners, perhaps our stomach bypasses our brains. Even the most virulent American nationalist, I suspect, enjoys the occasional Chinese take-out. Food is universal; and sharing food, breaking bread together, is a universal sign of peace.

In the heady days of Trump’s campaign, one of his supporters, Marco Gutierrez, warned that, if the Mexicans weren’t pushed out, there would be “a taco truck on every corner.” Perhaps this is exactly what we need, as attacks on immigrants’ rights increase daily.