Quotes & Commentary #83: Austen

There is something so amiable in the prejudices of a young mind, that one is sorry to see them give way to the reception of more general opinions.

Jane Austen

Part of getting older—in my experience at least—is becoming more “normal.” Of course, “normal” is hard to define, and its definition always depends on social context. But basically, I mean behaving in a way that doesn’t make you stand out, as well as having beliefs that fall within the mainstream (Jane Austen’s “general opinions”).

For better or worse, by this generic definition, I find myself becoming more normal with each passing year. And I am continually reminded of this in my role as an educator. Teenagers like to push limits, and I often hear things which no “normal” person would say or even think—said just to provoke a reaction. These adolescent provocations are certainly not endearing.

But youth also includes a certain naïveté, in which opinions are unbound by considerations of what is practical or possible. Usually these opinions are absurd, but sometimes there is a spark of creativity that, I feel, I and many adults have lost. (Though to be fair, most of their theories of how to improve society involve abolishing school.)

And although I am not so sure about the idea—commonly bandied about—that schools are designed to beat out creativity, it is certainly true that schools are designed to establish a certain level of normalcy among their students. A student studying a standard curriculum, frequently mingling with their neighbors, will almost necessarily be more “normal” than somebody who, say, was homeschooled in a cabin on the prairie.

There is certainly a strength in “weirdness”—the ability to see things differently, to think outside normal paradigms, and perhaps even to push society forward. But there is such a huge social and economic benefit to normalcy that I think it would be remiss of educators not to at least try to guide students in that direction. And, in any case, a certain social baseline is obviously necessary if people are to live and work together.

Whether educated at home or in a public school, however, becoming a working adult requires of most of us at least the ability to appear “normal”—dressing and acting in ways that fall within some margin of acceptability. True, the range of what is considered acceptable is growing wider in some respects, particularly in terms of appearance, as dress codes become less formal and, for example, tattoos become more common.

But in other respects, such as what opinions can be expressed without fearing an adverse reaction, I don’t think that we are any more tolerant of weirdness now than we were in the past. And given that reality, it behooves most of us to lose the “prejudices of a young mind,” as Austen says, and adopt the pleasantries of an adult brain to get along in life.

Yet this isn’t the whole story. Another thing I’ve noticed as I’ve gotten older and more “normal” is that, at a certain point, people regress into weirdness. Specifically after retirement, I’ve noticed (not to point fingers at anyone in particular) that people can develop zany opinions and odd behavioral tics. It is as if the constant pressures of school and then work are what keep people “normal,” and as soon as those pressures ease off, the weirdness comes rushing back. And, to keep to Jane Austen’s theme, this weirdness often manifests itself in prejudices and opinions that are far from “general.”

One might think that a lifetime of experience might insulate one’s mind against nonsense. But the passing years seem to make many people, if anything, more susceptible to unrealistic or outrageous beliefs.

I suppose it is not a novel observation that older folks can fall victim to scams, conspiracy theories, or simple superstition. But I do find it mildly depressing that age, far from conferring wisdom, can involve becoming unpresentable at parties. 

To put it in Jane Austen’s terms, while the prejudices of a younger mind may be “amiable,” those of an older mind are typically quite the reverse. But I suppose both deserve sympathy, if for different reasons.

Quotes & Commentary #82: Tolstoy

In historical events great men—so called—are but the labels that serve to give a name to an event, and like labels, they have the least possible connection with the event itself.

Leo Tolstoy

Anyone who has made it to the end of War and Peace will remember the strange sensation of finishing one of literature’s most epic stories and immediately being thrown into—of all things—an essay on the philosophy of history. With your heart throbbing with emotion for the characters who made it to the end of the novel, you are hardly in a fit frame of mind for considering the deep mechanisms of historical progress.

Tolstoy himself may not even have been in a fit frame of mind, as his essay—while well-written and interesting—is not exactly persuasive. Indeed, the philosopher Isaiah Berlin wrote one of his most famous essays, “The Hedgehog and the Fox,” analyzing why a man as brilliant as Tolstoy could put forward a work of analysis that does not really hold together. In Berlin’s opinion, Tolstoy is a classic “fox” trying to be a “hedgehog”—meaning that, although Tolstoy’s brilliance was expansive and intuitive, born of his great gift for empathy, he longed to be a systematic thinker who could reduce life to logical conclusions.

Tolstoy’s particular gifts aside, let us turn to the main argument of his essay—namely, that the “Great Man” theory of history is a mistake. Tolstoy argues that, although we conventionally ascribe major historical events to the decisions of certain powerful individuals, it is more accurate to think of history as the product of a countless number of choices and actions by everyone involved. To use Tolstoy’s own example of Napoleon, he argues that, although Napoleon is treated as a kind of mortal god who changed European history—a military genius who defeated so many foes—in reality he was usually unaware of what was happening during his major conflicts, and entirely powerless to influence the outcomes of his battles.

Now, I think nearly everyone would say that Tolstoy pushes his point too far. I doubt many historians would be willing to argue that Napoleon was not a particularly important man, or to deny that he exerted an influence on the outcome of his battles. Nevertheless, the opposite opinion—say, that Napoleon was wholly in control of his destiny—can also easily be taken too far. Indeed, as far as I can tell, among historians nowadays there is quite a bit of hostility to the “Great Man” view of history.

I hate to be the sort of person who argues for a middle ground. “It is a little of both” is the oldest intellectual cop-out in history. Nevertheless, it does seem logically inescapable that “men” (or, to be a little more inclusive, “people”) are both products of their times and makers of events. The more interesting question is to what extent any given individual is crucial to the shape of history.

The most convincing proponent of the “Great Man” view of history I know is Robert Caro (whose tomes make even War and Peace seem lightweight). In his biography of Robert Moses, for example, Caro makes a convincing case that Moses was a uniquely gifted administrator—an expert in the accumulation and deployment of political power. The very shape of New York City—its many highways, bridges, and tunnels, its parks and housing developments—is, for better or worse, a testament to Moses’s influence.

But a skeptic might say that, as special as Moses might have been, cities all over the United States implemented similar programs—bulldozing neighborhoods for highways, demolishing buildings to create high-rise public housing. If Moses was really so special, then why did he merely achieve what was accomplished all around the country by far less famous individuals?

This rebuttal works in the abstract but not the concrete (pun intended). In Caro’s telling, you can see exactly how Moses subverted rules, bypassed regulations, and bent politicians and contractors to his will, in a way that was completely unprecedented. After witnessing his machinations, it is very difficult not to be convinced that, at the very least, Moses’s specific personality and prejudices carried historic weight.

This is not to take the opposite view, that so-called “Great Men” are the only ones who count in history. It is just to make the claim that, while every individual can exert some influence on the shape of history, some individuals wield considerably more influence than others. And while this may feel like a cop-out, arguably both Tolstoy’s view and the opposite extreme are anti-humanistic—the former, because individual qualities are held to play no role in history, and the latter, because the majority of humankind are reduced to automatons carrying out the will of a few geniuses on top.

I will cease to belabor this point—which I suspect will seem rather obvious to most—as it is just another form of the “chicken and egg” problem. I will only add that, as an amateur student of history, I think it is greatly rewarding to consider the individual experiences of both the major players (the so-called “Great Men”) and the supposedly “ordinary” people on the bottom. It all has much to teach us.

Quotes & Commentary #81: Pirsig

We have artists with no scientific knowledge and scientists with no artistic knowledge and the results aren’t just bad, they’re ghastly.

Robert Pirsig

Among the many woes of American higher education nowadays, one is the precipitous decline of the humanities. Students these days, apparently, are opting for science, business, or engineering degrees, rather than the liberal arts. And as administrators slash budgets in history and literature departments in response to this declining enrollment, some writers and educators have stepped forward to defend this ancient, noble pursuit.

David Brooks attempted a sort of defense recently, in his column “How to Save a Sad, Lonely, Angry and Mean Society.” His argument was essentially that exposure to great works of art develops empathy, as great art trains us to see the world through different perspectives. It is Brooks’s belief and hope that such exposure translates to moral behavior. After all, if we can appreciate the needs, thoughts, and beliefs of others, we will certainly be more kind to them.

I would very much like to agree with this argument. But I have a hard time swallowing it. For example, one of the artists that Brooks mentions is Pablo Picasso, whose great painting Guernica arguably improves its viewers by making the horrors of war viscerally palpable. Picasso himself was, however, a notorious abuser of women, despite being as steeped in art as a person can possibly be. Indeed, history is so replete with cultured criminals—many prominent Nazis were highly educated connoisseurs, to pick just one notorious example—that the notion of betterment through studying the humanities can seem rather silly.

And yet, it is difficult for me to entirely let go of this idea. As a counterpoint, I might mention the former American Secretary of Defense, Robert McNamara. After watching Errol Morris’s wonderful documentary about McNamara, one is left with the impression of a man dominated by instrumental thinking. That is, McNamara is always concerned with the how of any question—and he is content to let his superiors worry about what he is doing, why he is doing it, and whether he should be doing it in the first place. Put another way, I think McNamara illustrates the limitations of a purely technical mind—even a brilliant one—as he attempts to make a stupid, immoral war machine as efficient as possible.

Perhaps it is fairer to say, then, that the humanities, while not sufficient for moral behavior, are a necessary condition of it. Or perhaps one must make the even weaker argument that, in general, exposure to philosophy, literature, history, and the arts tends to make us more moral. Or perhaps we might even have to take one further step back, and resign ourselves to saying that these subjects give us an opportunity to at least consider how we could become more moral. If that isn’t convincing, then we ought to just admit that the humanities are valuable simply because they make life more pleasant and interesting—which should be enough, anyway.

What does seem quite clear to me is that all the humanities and arts in the world will not be enough to extricate us from the moral morass that the United States—and, to a worrying extent, much of the rest of the world—seems to have fallen into. Individual enlightenment, even if it is achievable, does not stand much of a chance against collective stupidity. As dirty and disheartening as it is, we must participate in politics as partisans if we want to create a better world.

In any case, I think the decline in student enrollment in the humanities should not be ascribed simply to the deterioration of our culture or the coarse values of the new generation. A huge part of the explanation is simply cost. It is one thing to, say, hold history or philosophy in high esteem, but quite another thing to decide to go into thousands of dollars of debt to acquire such knowledge, with no assurance of a decent job on the other end. Expensive universities only make financial sense if they lead to a good career. (Having studied anthropology, I am in no position to be moralizing on this topic.)

In many ways, the university system here in Spain seems more logical to me. Rather than living on a luxurious campus and indulging in the life of the mind, most university students here commute from home, pay a modest fee, and learn exactly what they need to work in a specific job. In other words, it is job training, pure and simple (at least for most people).

And yet, I am old-fashioned enough to think that there is something good and valuable in the old liberal arts model of education, even if it is difficult to justify on economic grounds. Like Pirsig, I shudder to think of a world where people are only familiar with their own specialty, be it science or art. Education should not be reduced to technical training, or we will be left with a society of people unable to think about problems beyond the narrow domain of their fields. But how can the humanities be kept alive amid the ballooning cost of universities and the dwindling job opportunities of the market? This is a question beyond my ken.

Quotes and Commentary #80: Schopenhauer

Whoever takes up and seriously pursues a matter that does not lead to material advantage, ought not to count on the sympathy of his contemporaries.

Arthur Schopenhauer

Despite the greed, grubbiness, and graft associated with capitalism, looked at in a certain light it can appear positively utopian. Certainly many economists and centrist politicians have thought so. In a free market there is no such thing as inherent value. No authority, not even a divinely ordained one, can determine that something is worth paying for. The only true test of worth is whether people want it, and how much they are willing to spend to get it. That’s it. You can even argue that truly pure capitalism—with perfectly free consumers in a perfectly open market—is a kind of existentialist paradise, where every person determines their values through their own decisions (specifically, by deciding what to buy).

Of course, as any behavioral psychologist, Marxist, anthropologist, or clear-eyed person will tell you, this paradise of free choice is very far from the reality we live in. Nevertheless, I think that many of us internalize the idea that value is determined via the market—not only personally (as the existentialists might have it), but objectively. If a song is #1 in the charts, for example, then it must be good by definition. Anything people choose to spend money on simply must be better than what they choose to ignore. By extension, any activity that does not make a profit is, objectively, a waste of time. Money is the ultimate arbiter.

Now, I am not against making money. But I am opposed to the idea that an activity must bring a profit in order to be worth seriously pursuing. A good hobby should, above all, bring pleasure to oneself. Money is a bonus. 

In many ways the internet has ushered in a golden age of hobbies, by allowing networks to form among practitioners across vast distances and making available resources that previous generations could scarcely dream of. Birdwatching, for example, used to be done in solitude or, at most, in a local group, with only a guidebook as a resource. Now apps can identify birds by photo or call, or notify users when a certain species is sighted in their area, pooling the collected knowledge of the entire community.

But the internet has also made it possible to monetize these hobbies—or try to. Whether taking photos, making paintings, or recording music, now we can all be miniature professionals by selling our work or services on the web. (Birders have mostly kept out of the market, though.) And when these ventures perform poorly—as most inevitably will—a tinge of disappointment and failure hangs over what, in another time, might have been a perfectly carefree pursuit. In other words, we now have the ability to turn virtually any skill we have into another job—which is not exactly a recipe for joy. 

Of course, Schopenhauer was not talking about hobbies. With a good deal of self-pity, he was referring to his own largely unrewarded and unrecognized labor to create a new system of philosophy. That bitter man was certainly not the only genius whose work was ignored by his contemporaries. There are too many to name. In retrospect, it is a wonder that people can be so blind. And yet, the idea that posterity is the ultimate judge—which Schopenhauer would likely agree with, I think—is just another version of the idea that markets are the ultimate judge of value. In this case, you can just say that the market is a little bit slow.

But, as I mentioned in my review of Van Gogh’s letters, this introduces a kind of paradox. For if the market is the arbiter of value, and that market can be tardy in coming to a verdict, then we must labor under the uncertainty of our own worthiness. We can spend our lives painting and leave behind a treasure for the ages, or we can spend our lives painting and leave behind junk nobody wants. Since we might die before our work is “discovered,” we might never know. Herman Melville, for example, could probably never have dreamed that Moby Dick—which sold poorly and got mediocre reviews—would become the Great American Novel. 

Are there any lessons to be drawn from this? Maybe the very idea that markets—including posthumous markets—determine value ought to be scrapped. After all, there is very little stability or unanimity in mass opinion. For all we know, in 100 years Van Gogh might not even be popular or beloved anymore. Schopenhauer’s reputation has certainly had its highs and lows.

Quotes & Commentary #79: Tolkien

Many that live deserve death. And some die that deserve life. Can you give it to them? Then do not be so eager to deal out death in judgment. For even the very wise cannot see all ends.

J.R.R. Tolkien

I find myself revisiting this long-defunct section of my blog in response to the news of Kenneth Smith’s execution, which took place on January 24th of this year. Smith was condemned for the 1988 murder of Elizabeth Sennett. He had been hired through an intermediary (who received life in prison) at the behest of Sennett’s husband, Charles (who killed himself once he learned that he was suspected). Smith committed the murder—brutally beating and stabbing Sennett to death—along with another man, John Forrest Parker, who was executed in 2010.

Smith was first scheduled to be executed via lethal injection in 2022, but the execution was botched—the third one in a row in the state of Alabama. After over an hour of trying, the execution team gave up, unable to properly place the IVs in Smith’s veins.

This is why, when Smith’s execution was rescheduled, it was decided to carry out the grim task using a novel method: nitrogen asphyxiation. During this procedure, the victim is strapped down to a gurney and fitted with a mask, which forces him to breathe in nitrogen until death occurs. It was the first execution of this kind performed in the United States—perhaps in history. And while the Alabama Attorney General insisted that the execution was “textbook,” and predicted that “many states will follow,” witnesses described Smith writhing and gasping for a number of minutes before finally succumbing.

When I read about this execution, I felt an acute sense of horror and disgust. In my moments of optimism, I like to imagine that, as the years go by, our ethical standards are becoming ever-more elevated. It is thus acutely depressing to hear that, in 2024, we are still fumbling for ways to kill our prisoners—and that asphyxiation is being regarded as, somehow, innovative and humane.

In my view, punishments can only be justified on a limited number of grounds. It is justifiable, for example, to isolate somebody who has proven dangerous to others. And legal consequences are warranted if they serve as deterrents for other potential criminals.

Yet imprisonment isolates a prisoner just as effectively as execution, while study after study has shown that the death penalty does not, in fact, deter potential criminals.

All this seems rather pedantic to say, as it is quite obvious that capital punishment is not a policy born of logic. Rather, it only exists to satisfy a primitive urge for vengeance. It is Old Testament wrath, and not New Testament mercy.

Now, anybody can certainly understand the urge to get back at someone. And perhaps executions can provide closure for the family and friends left behind by a murder. However, vengeance is not, and cannot be, justice. Indeed, our institutions of justice have been created precisely to supplant the basic law of an eye for an eye. And even if—at least in some parts of America—capital punishment is widely popular, and even if it provides some sort of consolation to some, the death penalty is impossible to justify according to any ethical framework I am familiar with.

It may be true, as Tolkien said, that there are some who “deserve death.” However, I find it disturbingly hubristic to think that any human institution, however admirable its ideals, is wise enough to mete it out. Kenneth Smith certainly deserved punishment. But I cannot see how asphyxiating him has made the world a better place.

Quotes & Commentary #78: Basho

It was with awe

That I beheld

Fresh leaves, green leaves

Bright in the sun

Matsuo Basho

Last year, during the early months of the pandemic, I took up writing these little essays once again. But it was not exactly in good faith—that is, I did not do it in the original spirit of the Quotes & Commentary, as an exploration of my own beliefs. Instead, as has happened to me before, the essays became a vessel to comment upon current affairs, which of course meant the COVID-19 pandemic. I had an awful lot to say about things I know very little about. So now, for a change, I will focus my attention closer to home.

As it happens, I am as close to home as it is possible to be right now, since I am visiting Sleepy Hollow for the summer. There are, of course, a million things I enjoy about being here. Family, friends, and food handily win gold, silver, and bronze, respectively. But over the years, I have come to realize how much I long for the natural environment of my native place—the climate, flora, and fauna of the Hudson Valley. 

Madrid has its beauties, especially in the mountains. Indeed, the Hudson Valley is, by comparison, flat and undramatic. The atmosphere, too, is rather cloudy and thick here compared with the crystalline clarity of Spain. Even if you do find a sufficiently high place, the view can be obscured by the humidity.

But what my hometown has in abundance are fresh leaves, green leaves. It is just so verdant here that, compared with arid Castile, it can seem like a tropical rainforest. Trees—many over one hundred feet tall—cover the landscape, some of them in turn covered with climbing vines. In Madrid, if you want to visit anything remotely approximating this, you have to make a reservation weeks in advance and then drive to the Hayedo de Montejo de la Sierra, a beech forest occupying a microclimate in a mountain valley, where you will be given an hour-long guided tour. It’s just not the same.

Here, by contrast, I have Rockefeller State Park right behind my house. I can go anytime I want, for as long as I want; and that means every day I can. Walking, hiking, or running is obviously good for your body. Research has shown that spending time in nature has positive psychological effects, too. Indeed, “forest bathing”—a kind of tree-based therapy—became something of a fad in Spain a few months ago. It is taken seriously in Japan and Korea. I have no idea whether a walk in the woods can help with severe depression, anxiety, or trauma. But I am quite sure that it can put you in a better mood, help calm you down, or make you think more clearly.

Part of it, I think, is the sensory richness of natural environments. A forest is visually more complicated than most urban landscapes. It is not organized using perfectly straight lines (something seldom found in nature), in neat and orderly rows. Living things are shaped by natural selection, while the land itself is shaped by geological processes, neither of which results in anything like a suburb or a cityscape. Nevertheless, it is not random or chaotic. Rather, natural landscapes are organized more subtly, on scales of time and distance that are not necessarily perceptible by us.

Forests are also rich in every other sensation, too, though admittedly I don’t spend much time touching and tasting. Perhaps I should. There are wild blackberries and blueberries in this area. But there is also poison ivy, and ticks carrying Lyme disease, so I tend to stay on the gravel paths. Still, my nose keeps quite active, drawing in all the various fragrances—cut grass, sheep dung, flowers, compost, and most of all fresh air, untainted (for the most part) by exhaust. And I am not inclined to take air for granted these days, since a couple of weeks ago the smoke from a massive fire in Oregon drifted over and turned everything grey, rendering the air harsh and unwholesome. I had a cough for weeks.

Yet it is the sounds that I cherish second only to the sights. The forest is sonically active, especially in summer. Cicadas scream from the treetops, while crickets sing from the undergrowth, and birds of every variety call all around. Just today I was lucky enough to see a hawk on a nearby branch, uttering its piercing cry. Madrid, by comparison, is as quiet as a church. The sensation of being surrounded by this chorus is intensely soothing. It is like being submerged in a cool bath. Matsuo Basho appreciated this, too:

In the utter silence

Of a temple,

A cicada’s voice alone

Penetrates the rocks.

Quotes & Commentary #77: Camus

Really, however, it is doubtful if this could be called a victory. All that could be said was that the disease seemed to be leaving as unaccountably as it had come. Our strategy had not changed, but whereas yesterday it had obviously failed, today it seemed triumphant.

Albert Camus

We humans are vulnerable to a variety of cognitive illusions, not the least of which is the illusion of control. The idea that an event is completely out of our control is extremely difficult for us to accept, apparently; and so our brain tricks us into thinking that we are the ones pushing the buttons. This can take many benign and amusing forms. For example, many of us repeatedly push the call elevator button or the crosswalk signal while waiting, with the idea that we can somehow speed it up. Or we leave the pit of an avocado in some guacamole, thinking we can prevent it from going bad. 

This behavior often leads to superstitions, especially in situations where chance plays a major role. Baseball, for example, is notorious for the many superstitions that abound among its players, who recruit supernatural intervention to reduce the role of chance. Fundamentally, these superstitions all make the mistake of confusing correlation for causation. So if a batter eats sixteen carrots and then hits a home run, he may conclude that the home run was due to the carrots.

And this process can take place on a societal scale. The classic example is, perhaps, the rain dance—an attempt to control weather patterns through ritual. Indeed, the idea that humans can influence the natural order through carefully prescripted and repeated gestures is arguably one of the psychological roots of religion. 

The reason that I am bringing all of this up is that I believe we can observe this process quite clearly in our response to the coronavirus. All of us badly want to feel as though we can control the spread of the virus, and this has led to some sensible and, I suspect, some far less sensible solutions. I have observed several people in my neighborhood who put little bags on their dogs’ feet. Only slightly less ridiculous are the shoe disinfectant mats being sold online. Even the practice of wiping down our groceries with bleach strikes me as more ritualistic than sensible. 

Indeed, considering that we can get the virus just from breathing in particles, then all this trouble to disinfect surfaces does seem rather suspect to me. I cannot help thinking that, by the time you touch an infected surface, you will have breathed in the virus ten times before. (And by the time you get it from your dog’s paws, you will have gotten it one hundred times before.)

Just as in superstition, irrational virus precautions can take place on a societal as well as an individual scale. The most notorious example of this I have seen was the bleaching of a Spanish beach, in the coastal town of Zahara de los Atunes. While undoubtedly causing significant environmental damage, the benefits to coronavirus control seem doubtful in the extreme. As another doubtful measure, I would offer Governor Cuomo’s decision to disinfect New York City’s subway system every night. Again, if the virus can be breathed in, then disinfecting surfaces may be largely redundant.

More generally, I think it is fair to say that we do not completely understand the pattern of coronavirus spread. A few days after announcing the nightly subway cleaning—a massive and expensive effort, which displaces the homeless and may impede some people’s commutes—Cuomo announced the results of a study of 600 people who were diagnosed with the virus in a hospital. He was surprised to find that only 4% had used public transportation. The large majority were not working. This result is puzzling. If mere exposure to the virus were enough, then one would expect essential workers—especially those on public transport—to constitute a far larger portion of cases, since they come into contact with far more people.

Perhaps we have overestimated the importance of mere exposure, then, and underestimated the importance of “viral dose.” (Please keep in mind that I am in no way an expert, and this is pure speculation on my part!) This means that a long time spent with one infected person could matter more than passing proximity with several. If this is the case, then forcing people to stay in their homes, even if they have symptoms (which was the policy here in Spain), may be somewhat counterproductive, since it would increase the viral dose of any co-residents.

This would also mean that prohibitions on outdoor exercise were not sensible. Indeed, more than two weeks after Spain finally let children go outside, no noticeable uptick has been observed (despite complaints that people were not maintaining the correct distance). Other evidence points in this direction as well. One Chinese study could find only a single case of outdoor transmission, and instead found that the vast majority of outbreaks took place inside the home.

Globally, the data also seems to indicate that we do not fully understand the relevant variables. The virus seems to be striking some countries hard while leaving others mostly untouched, in a pattern that is not easily explained by either governmental action or weather. The case of Spain and Portugal seems especially baffling, as Spain’s small neighbor has so far suffered one-fifth as many fatalities per inhabitant—and this despite never having imposed mandatory stay-at-home orders or closed all non-essential businesses. Though Portugal is given credit for acting early, the two countries entered into a state of alarm at about the same time, closing schools and restaurants the same week. Yet the contrast is striking.

If we are going to combat this virus effectively, then I think we must do our best to resist the illusion of control. This cognitive illusion blinds us to the real effectiveness of our strategies. If we embark on a maximal strategy—doing everything we can think of to stop the virus—and the virus indeed abates, we may conclude that it takes a maximal strategy to beat the virus. But in that case, we may end up like the carrot-eating batter, drawing false conclusions from a mere correlation. And since so many individual measures are rolled into a maximal strategy, we remain in the dark as to which specific measures are the most helpful, which are basically useless, and which are counterproductive.

This information is vital if we are to achieve anything resembling a functional economy. Our goal should be to uncover the measures with the lowest cost-benefit ratio—inexpensive and minimally inconvenient strategies which effectively curb the virus. If masks indeed work, then widespread mask usage would be such a strategy, since masks do not significantly disrupt normal life and cost mere pennies to produce. If it is not too late, increased protective measures for senior care homes would be another such strategy, since age is a major risk factor.

Perhaps the easiest way to determine such measures would be surveys. Governor Cuomo has already demonstrated the knowledge that can be gained by surveying incoming hospital patients. Indeed, we probably should have been doing so from the beginning, allowing a more detailed picture to emerge of which activities tend to increase risk. Widespread serological testing for antibody prevalence can also be easily supplemented with detailed surveys. With any luck, certain patterns will emerge from this data, which will point us in the right direction.

Another way to find out more about how and where the virus spreads would be to turn our testing capacity away from patient diagnosis and towards investigative studies. This would mean testing representative samples from relevant populations, to ascertain the prevalence of the virus in different areas and professions. Such testing may reveal useful patterns in the virus’s spread. Contact tracing—once we have the ability to do so—can be similarly used as an investigative tool.

But as it stands now, I often get the impression that officials (here in Spain at least) are like a blindfolded boxer, swinging left and right, hoping to connect with the target. The result is rather incoherent. For example, when people were finally allowed outside to walk and run, officials decided to impose time constraints on these activities. I am not sure what they hoped to gain from this. But the result has been that everyone rushes out the door as soon as the clock strikes, and the streets are consequently packed.

Adding to this, officials decided not to open the parks, so there is less space for walking. To compensate, they tried converting several roads in the city into pedestrian zones. But I cannot help wondering: how is a pedestrian zone any safer than a park? Last weekend we were treated to the absurd spectacle of joggers squeezed into a narrow, taped-off zone, jogging in one big circle around Madrid’s Retiro park, which remained closed.

Such policy mistakes are harmless enough, I suppose. But I think we need to be very wary of what this blind swinging can lead to. Traumatic events can provoke a panicked response that can do more damage than the threat we are trying to avoid. America’s last traumatic event—the September 11th attacks—provoked some very sensible changes, like increased airport security, but also set off a series of interventionist wars that cost far more lives than the original attacks themselves. Such wars seem rather absurd to many of us now; but at the time, when the threat of terrorism seemed to overshadow every other consideration, we were willing to react with a maximal strategy.

Does this crisis present us with a similar danger? I think it may. And if so, we need to do our best to avoid the coronavirus equivalent of an Iraq War, and focus on finding strategies equivalent to bomb screenings and reinforced cockpit doors—easy, cheap interventions that can save lives, rather than a giant quagmire that only adds another problem on top of the one we already have. If we are the blindfolded boxer, we need to focus on removing the blindfold, rather than swinging as hard as we can.

Quotes & Commentary #76: Thucydides


Reckless audacity came to be considered the courage of a loyal ally; prudent hesitation, specious cowardice; moderation was held to be a cloak for unmanliness; ability to see all sides of a question inaptness to act on any. The advocate of extreme measures was always trustworthy; his opponent a man to be suspected.

—Thucydides

In my previous post I bemoaned the conversion of a public health crisis into yet another partisan fight—with those on the left for the lockdown, and those on the right against it. In this regard, I think it is striking to reread this passage of Thucydides, as it encapsulates a common occurrence in times of crisis: the preference for extreme measures over moderation, for decisiveness over prudent hesitation.

The reason for this is our very human need to feel safe and secure. Having a plan, especially a drastic plan, is one of the ways we accomplish this. Carrying out extreme measures at least gives us the illusion of control; and control is what everyone craves in an emergency. But I do not think we should let this very human need prevent us from being critical, open-minded, and moderate. These are good qualities in the worst of times as well as in the best of times.

My main concern is that I think that too many people—especially on the left—are advocating long, strict lockdowns as the only possible option. Calls to reopen are being dismissed as irresponsible or even nefarious, and respected epidemiologists like David Katz (who advocates more measured policies) can only get a hearing on Fox News and Bill Maher’s show. This makes me worry that the left is backing itself into an ideological corner, insisting that lockdowns are the only way to fight this virus.

There is certainly a noble impulse in this: valuing human life over profit. But I think that the situation is far more complex than this dichotomy implies. A narrative is starting to emerge that it is our evil corporate overlords (Elon Musk, most notoriously) who want us to return to work in order to satisfy their greed. Already in Georgia, people are compiling lists of businesses which are reopening with the intention of blacklisting them for doing so. But are we really willing to vilify people for reopening when remaining closed would mean bankruptcy, financial ruin, and losing their livelihoods? The anti-corporate, pro-lockdown messaging is ignoring the simple truth that the economic effects of the lockdowns will hurt the poor far, far more than they will hurt the rich. 

Granted, we could and should be doing much more to help the poor and disadvantaged during this time. Also granted, our economy was rife with structural inequalities before all of this, which ought not to have been there to begin with. But we must work with the economy we have and with the options that are politically possible—not with the economy we should have had and the things we wish we could do. And we must also be sure that our policies are shaped by prudence rather than fear or ideology.

So here is my worry: if the left (of which I consider myself a member) becomes the party of lockdowns, this may not appear so wise in retrospect. This is because the efficacy of our anti-virus measures is still very much an open question; and it is thus very possible that some of our policies will have done more harm than good.

As a prime contender for this, I would submit school closures. As I noted in my previous post, young children seem both safe from, and hardly able to transmit, the virus. The idea that they were major transmitters was an educated guess, and it seems to have been wrong. Keeping children out of school, however, will undoubtedly be harmful to their development and detrimental to their futures. And it will most certainly do the most harm to the poorest among us. Furthermore, keeping children home puts more pressure on parents, and may take some doctors and nurses out of commission.

At the very least, I think it is wrong to close schools in a “better safe than sorry” mentality, without very thorough consideration of the costs and benefits. As a teacher myself, I can say with confidence that virtual learning is no substitute for being in the classroom. If my students must miss class, I want to be sure that it is to protect them, and not simply to make us feel safer. I am not willing to sacrifice their education to satisfy my panic.

Here is the trouble with a total lockdown: it combines so many different measures into one sweeping global approach that we have no opportunity to see which specific parts of the lockdown—closing restaurants, canceling concerts, calling off school—deliver the most benefit for their cost. It simply cannot be taken for granted that a total lockdown is the single best strategy going forward. In the absence of more data about the virus’s lethality and total spread, we cannot even be confident that it was a wise strategy to begin with. (A study by the Wall Street Journal—which admittedly has its own biases—found no correlation at all between coronavirus mortality and the speed of lockdown in U.S. states.)

The case of Sweden should give lockdown advocates pause. Sweden has become notorious for its lax coronavirus measures. Shops and restaurants are open, and life carries on without masks or gloves. Meanwhile, most other European countries instituted strict lockdowns. Spain had one of the strictest of all. Parks were closed, and people were not allowed to go on walks or to exercise outside. All non-essential businesses were shuttered, and people could only leave the house to go to the pharmacy or the supermarket. Police patrolled the streets, giving out hundreds of thousands of fines and making hundreds of arrests in enforcement of the lockdown.

If lockdowns were really an effective way of controlling the virus, then one might expect Spain to have a substantially lower death rate. On the contrary, Spain has suffered twice as many deaths per million as Sweden. Indeed, Sweden is in the ballpark of Ireland and Switzerland, two countries that took swift, decisive action to shut down their economies. And Sweden’s “curve” seems to be leveling out anyway. To say the very least, the Swedish approach has not been an unmitigated disaster.

Admittedly, if you compare Sweden to its Scandinavian neighbors, Finland and Norway, you can see that its lax policy seems to have resulted in a higher mortality rate. Does this prove that Sweden has taken the wrong course? I think we should not rush to judgment. First, it is entirely possible that, as Finland and Norway open up, their death rates will climb to approach Sweden’s. Furthermore, by minimizing the damage done to its economy, there is a very real possibility that Sweden inflicted less total harm on its society.

(We also should not rush to declare New Zealand’s tough policies a success, though for the moment they seem to have eliminated the coronavirus from its shores. While this is impressive, it remains to be seen whether this was the best strategy for the long term, since it is possible that it will only make it that much more difficult to reestablish open channels with the outside world.)

My own personal fear—which apparently is not shared by many—is that the left will put itself in a bad position if it becomes the party of the lockdown. At the present moment, there is an awful lot of fear of this new virus. But in six months, when the elections roll around, what will be at the forefront of people’s minds: the virus, or the economic depression?

My guess is that, as time goes by, fear of the virus will fade, and concern for ruined businesses, blasted retirement accounts, and lost careers will only grow more acute. So far, it seems that Republicans have shifted most decisively in the direction of economic concern, with Independents shifting somewhat in that direction, while Democrats have hardly budged.

Such flagrantly political concerns should not guide our policy. Concern for human welfare should. And I am afraid that we may be developing myopic and unrealistic ideas about the lockdowns in this regard. Somewhere along the line, many people seem to have forgotten that the original idea of “flattening the curve” was to prevent the healthcare system from being overwhelmed. The idea was never that we would absolutely prevent people from getting sick. Unless we are willing to stay inside until a vaccine is widely available—an unknown timeline, but still many months away—we are simply going to have to accept some risk from the virus.

Now, perhaps some rich countries could afford to stay shut up indoors until we have a vaccine. And maybe this would benefit these rich countries (though I doubt it). However, I think such a prolonged and severe period of economic inactivity would be horrendous for poorer countries. Telling people to stay inside is simply not feasible where people live in shacks and have no savings. And governments in poor countries could not afford drastic social policies to keep their people fed, especially during a severe depression. (Remember that a depression in richer countries means a depression everywhere.) A months-long lockdown could easily result in food shortages in many parts of the world, which might claim significantly more lives than the virus itself. 

What is more, prolonged economic depression has serious political repercussions. Economic instability easily translates into political instability, and political instability easily translates into violence—even war. The 2008 financial crisis has already had an awful effect on worldwide politics, eventually giving rise to waves of populist right-wing parties and a growing polarization that has produced increasingly dysfunctional governments. And this is only to speak of rich countries.

In countries already struggling with low standards of living and ineffective governments, what will be the results of an economic crisis much more severe than 2008? Will every government be able to take the pressure? We must keep in mind that, if any government fails, the consequences will be bad for everyone. As we have learned, power vacuums leave the door open for the most dangerous among us to gain control.

For those of us on the left, I think it behooves us to examine the complete picture, and not to fall into easy rhetoric about workers being sacrificed for the economy. These are the hard facts: the virus is here to stay; and if the economy is not working, it will be very, very hard on millions of people—especially poor people all around the world. Our governments could and should do more to alleviate the economic suffering. But many countries around the world simply do not have the resources to do so, and a severe depression will only make this more true. Of all people, we on the left should know that poverty hurts and kills, and we cannot afford to turn this into yet another purity test.

The hardest truth of all, perhaps, is that we are in a horrible situation that requires us to make painful compromises. An ideology that promises easy answers and readily-identifiable villains will not get us very far.

Quotes and Commentary #75: Franklin


So convenient a thing is it to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do.

—Benjamin Franklin

Our brain has an astounding ability to formulate justifications for believing things that we want to believe. As I have already remarked, the current pandemic—a major historical event—seems to have changed nobody’s mind. Socialists are calling for universal healthcare, capitalists are calling for corporate bailouts, and in general the old political battle lines are as strong as ever.

Indeed, at a time when one might expect people to turn towards experts, I have observed some even embracing—of all things!—anti-vaccination. Really, if a pandemic cannot convince somebody that viruses really do cause disease and that we ought to prepare for them, then I do not see much hope for reasoned debate.

Considering that our collective response to a novel virus has somehow turned into a partisan fight, I do think there is ample reason to suspect that many of us are not reacting rationally. And there are powerful emotional reasons for this. Specifically, I think that we have been impaled on the twin horns of fear and foolhardiness.

A part of our brains is terrified of contagion and disease, while another part is prone to wishful thinking. And it seems that these two emotional reactions are coming to dominate the two sides of the political spectrum: the left is fearful, while the right is foolhardy. The problem is that I think both emotions lead to irrational and unsustainable responses.

Giving in to fear means embracing long and even indefinite lockdowns. Better to be safe at home than to put oneself and others at risk outside. And of course this is not wholly unreasonable: the coronavirus is real, and it has taken lives. My concern is that the original justification for the lockdowns is being forgotten. The idea of “flattening the curve” was to prevent our healthcare systems from being overwhelmed, which was a real danger in places like Madrid and New York City. The idea was that, if we flattened the curve, we would save lives—not by reducing the total number of infections, but by spreading out the infections so our healthcare system could handle them.

But somehow this thinking got lost, and I am afraid that there are many who think we can somehow eliminate the virus altogether by staying inside. The virus will be there, waiting, whenever we leave our homes. A virus has no timeline and infinite patience. So unless we are willing to wait until a vaccine has been invented, mass-produced, and widely distributed—which will be such a long time that it could cause unprecedented economic harm—then the only reason to wait, as far as I can see, is to make sure our hospitals will be able to handle the influx of patients. In places where the healthcare system is not nearly at maximum capacity, I do not see what would be gained by an extra month of waiting inside, other than allowing people to feel safe a little longer.

Demanding long, strict lockdowns even in areas where the virus is not widespread strikes me as a response motivated primarily by fear. And the problem with fear is that it gives you tunnel-vision, focusing all your attention on the source of danger, and reorienting all of your priorities around the new threat. This can lead to some obviously irrational behavior—in nearly all of us.

For example, a person may refuse to walk across a bridge because of his fear of heights, but may smoke cigarettes and drink heavily (much more serious risk factors for health) on a regular basis. Collectively we may focus frantic attention on a mining disaster that kills dozens—launching inquiries and investigations and instituting reforms—while we completely ignore hospital-acquired infections, which kill tens of thousands year after year. The human brain is wired to respond to immediate threats, and especially the kinds of threats (like violence and disease) that shaped our evolution. Much more serious threats, which kill through slower and less spectacular means, are easier to ignore. (Global warming is probably the best example of this.) 

The problem with our great fear of the virus is that I think it may cause us to neglect the costs of our containment measures. The success of a country’s policies cannot be measured in the number of its coronavirus cases and deaths alone. A profound and prolonged economic depression will reduce lifespans and, both directly and indirectly, cause deaths. But the suffering caused by the depression will be slower, longer, less spectacular, and thus less scary. Such damage is just as real, however, and it is just as much a consequence of our policies.

As I have argued before, a truly moral response to the crisis requires that we try to reduce harm as much as possible, all across the board. Fearfully focusing our attention exclusively on the coronavirus will lead us to be not only irrational, then, but potentially immoral.

Let me give a concrete example of this. From what I can tell, there is a growing amount of evidence that young children are in negligible danger from the virus. Moreover, according to a study performed in Australia, children are apparently not even a significant source of contagion. Indeed, in Switzerland the authorities are confident enough that children do not pose a danger that they have given young kids permission to hug their grandparents.

If it is true that children neither pose nor suffer a significant risk, then that would make many of our school closures questionable at best. In Spain, for example, schools will not reopen until the fall. But keeping kids out of school does serious, lasting harm to them—harm that disproportionately falls on poorer students, and harm that must also be taken into account when we make policies. (Spain’s policy of keeping children confined in their homes for six weeks likewise seems hard to justify in this light.)

My point is not to advocate for school reopenings, per se (though it does seem reasonable to me), but to make the more general point that the current environment of fear makes even a rational discussion about school closures impossible. Instead of balancing the risks and rewards to students, many on the left seem to instinctively dismiss such conversations as frivolous and irresponsible. I do not think this is a productive mindset.

The mirror image of fear is foolhardiness, and I can sense it taking hold in certain sections of the right. We see it most clearly in the demonstrations in front of state capitols, in which participants deliberately flout safety guidelines in order to demand an economic reopening, often for pretty superficial reasons (like a haircut). It seems that many people still believe that the virus is not a significant threat and does not require any special action. But judging from what happened both here in Madrid and in my own state of New York, I think there is ample evidence that this virus is, minimally, a significantly greater threat than the seasonal flu. A policy of doing nothing is clearly inadequate when bodies are outstripping coffin production and morgue capacity.

Foolhardiness also leads to another kind of cognitive blindness: wishful thinking. We all have the tendency to grasp at any data which supports our hoped-for conclusions. We can see this at work in a viral video of a certain Dr. Daniel Erickson (since removed from YouTube), who claimed that the data showed that coronavirus was, in fact, already widespread and far less deadly than the flu. But his argument was a product of wishful thinking, and rested on a basic error of statistics—extrapolating from a non-representative sample.

Erickson used our current coronavirus test records to extrapolate to the entire population. This ignores the obvious fact that most testing is done on people who are sick or who otherwise suspect they may have the virus. Our testing records, then, will inevitably show much higher rates of coronavirus than the general population. Extrapolating from this data is simply nonsense. Nevertheless, the video gained traction in right-wing circles and was even used on Fox News to argue against the lockdown.
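The error is easy to demonstrate with a toy simulation (again my own illustration, with invented numbers): if tests go mostly to people who feel sick, then test positivity will wildly overstate the true prevalence of the virus.

```python
import random

def positivity_vs_prevalence(seed=0, pop=200_000, prevalence=0.02):
    """Toy model of sampling bias: tests go mostly to the symptomatic."""
    rng = random.Random(seed)
    tested_positive = tested_total = 0
    for _ in range(pop):
        infected = rng.random() < prevalence
        # Assumed: infected people are 30x more likely to seek a test
        p_tested = 0.30 if infected else 0.01
        if rng.random() < p_tested:
            tested_total += 1
            tested_positive += infected
    return prevalence, tested_positive / tested_total

true_rate, positivity = positivity_vs_prevalence()
print(f"True prevalence: {true_rate:.1%}, test positivity: {positivity:.1%}")
```

With these made-up numbers, only 2% of the population is infected, yet a large share of tests come back positive. Extrapolating that positivity rate to the whole population, as Erickson did, multiplies the true figure many times over.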

A rational and moral response will be neither fearful nor foolhardy, but will take measures to minimize the total harm to society—both from the virus and the economic downturn. This is easier said than done, of course, especially in the world of politics. Personally, I think this is the ideal time to try out something like Universal Basic Income, which I think would ease the economic pressure and also give us more flexibility in combating the virus. But sadly this does not seem likely.

Given the real possibilities, then, I think that it is our obligation to cut the best path we can through both fear and foolhardiness, balancing the risk posed by the virus against the risk posed by a major economic depression. This means that we cannot let our fear of the virus be the only factor we consider; but we also cannot let wishful thinking cloud our judgment.

Acting rationally means fighting against the universal human tendency to give in to our hopes and fears. Both hope and fear, in different ways, distort the real danger.

Quotes & Commentary #74: Kahneman

Quotes & Commentary #74: Kahneman

We are prone to overestimate how much we understand about the world and underestimate the role of chance.

—Daniel Kahneman

Kahneman’s book Thinking, Fast and Slow is one of the most subtly disturbing books that I have ever read. This is due to Kahneman’s ability to undermine our delusions. The book is one long demonstration that we are not nearly as clever as we think we are. Not only that, but the architecture of our brains makes us blind to our own blindness. We commit systematic cognitive errors repeatedly without ever suspecting that we do so. We make a travesty of rationality while considering ourselves the most reasonable of creatures.

As Kahneman repeatedly demonstrates, we are particularly bad when it comes to statistical information: distributions, tendencies, randomness. Our brains seem unable to come to terms with chance. Kahneman cites the “hot hand” illusion in basketball—the belief that a player is more likely to make the next shot after making the last one—as an example of our insistence on projecting tendencies onto random data. Another example comes from history. Looking at the bombed-out areas of their city, Londoners began to suspect that the German Luftwaffe was deliberately sparing certain sections for some unknown reason. However, mathematical analysis revealed that the bombing pattern was consistent with randomness.
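The point is easy to verify for oneself. In the following sketch (my own illustration, not the actual wartime data), bombs fall uniformly at random on a grid of city blocks; purely by chance, some blocks are hit repeatedly while many others are spared entirely:

```python
import random

def hit_counts(seed=4, bombs=400, cells=400):
    """Drop bombs uniformly at random on a grid and count hits per cell."""
    rng = random.Random(seed)
    counts = [0] * cells
    for _ in range(bombs):
        counts[rng.randrange(cells)] += 1
    return counts

counts = hit_counts()
print("untouched blocks:", counts.count(0))
print("blocks hit 3+ times:", sum(c >= 3 for c in counts))
```

Even with exactly as many bombs as blocks, a substantial fraction of blocks go untouched while others are struck three or more times. To an observer on the ground, this looks like deliberate targeting and deliberate sparing; in fact it is what pure randomness looks like.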

The fundamental error we humans commit is a refusal to see chance. Chance, for our brains, is not an explanation at all, but rather something to be explained. We want to see a cause—a reason why something is the way it is. We do this automatically. When we see videos of animals, we automatically attribute to them human emotions and motivations. Even when we talk about inanimate things like genes and computers, we cannot help using causal language. We say genes “want” to reproduce, or a computer “is trying” to do such and such. 

A straightforward consequence of this is our tendency to see ourselves as “in control” of random outcomes. The best example of this may be the humble lottery ticket. Even though the chance of winning the lottery is necessarily equal for any given number, people are more likely to buy a ticket when they can pick their own number. This is because of the illusion that some numbers are “luckier” than others, or that there is a skill involved in choosing a number. This is called the “illusion of control,” and it is pervasive. Just as we humans cannot help seeing events as causally connected, we also crave a sense of control: we want to be that cause.

The illusion of control is well-demonstrated, and is likely one of the psychological underpinnings of religious and superstitious behavior. When we are faced with an outcome largely out of our control, we grasp at straws. This is famously true in baseball, especially with batters, who are notoriously prone to superstition. When even the greatest possible skill cannot guarantee regular outcomes, we supplement skill with “luck”—lucky clothes, lucky foods, lucky routines, and so on.

The origins of our belief in gods may have something to do with this search for control. We tend to see natural events like droughts and plagues as impregnated with meaning, as if they were brought about by conscious beings like us. And as our ancestors strove to influence the natural world with ritual, we imagined ourselves as causes, too—as able to control, to some extent, the wrath of the gods by appeasing them.

As with all cognitive illusions, these notions are insulated from negative evidence. Disproving them is all but impossible. Once you are convinced that you are in control of an event, any (random) success will reinforce your idea, and any (random) failure can be attributed to some slight mistake on your part. The logic is thus self-reinforcing. If, for example, you believe that eating carrots will guarantee you hit a home run, then any failure to hit one can be attributed to not having eaten just the right amount of carrots, at the right time, and so on. This sort of logic is so nefarious because, once you begin to think along these lines, experience cannot provide the route out.

I bring up this illusion because I cannot help seeing instances of it in our response to the coronavirus. In the face of danger and uncertainty, everyone naturally wants a sense of control. We ask: What can I do to make myself safe? Solutions take the form of strangely specific directives: stay six feet apart, wash your hands for twenty seconds, sneeze into your elbow. And many people are going further than the health authorities are advising—wearing masks and gloves even when they are not sick, disinfecting all their groceries, obsessively cleaning clothes, and so on. I even saw a man the other day who put little rubber bags on his dog’s paws, presumably so that the dog would not track coronavirus back into the house. 

Now, do not get me wrong: I think we should follow the advice of the relevant specialists. But there does seem to be some uncertainty in the solutions, which does not inspire confidence. For example, here in Europe we are being told to stand one meter apart, while in the United States the distance is six feet—nearly twice as far. Here in Spain I have seen recommendations for handwashing of between 40 and 60 seconds, while in the United States it is 20 seconds. It is difficult to resist the conclusion that these numbers are somewhat arbitrary.

If Michael Osterholm is to be believed—and he is one of the United States’ top experts on infectious disease—then many of these measures are not based on hard evidence. According to him, it is quite possible that the virus spreads more than six feet in the air. And he doubts that all of our disinfecting has much effect on the virus’s spread, as he thinks that it is primarily not through surface contact but through breathing that we catch it. Keep in mind that, a week or so ago, we were told that we could stop it through handwashing and avoiding touching our own faces.

Telling people that they are powerless is not, however, a very inspiring message. Perhaps there is good psychology in advocating certain rituals, even if they are not particularly effective, since it can aid compliance with other, more effective measures, like social distancing. Rituals do serve a purpose, both psychological and social. They help to focus people’s attention, to reassure them, and to increase social cohesion. These are not negligible benefits.

So far, I think that the authorities have only been partially effective in their messaging to the public. They have been particularly bad when it comes to masks. This is because the public was told two contradictory messages: Masks are useless, and doctors and nurses need them. I think people caught on to this dissonance, and thus continued to buy and wear masks in large numbers. Meanwhile, the truth seems to be that masks, even surgical masks, are better than nothing (though Osterholm is very skeptical of that). Thus, if the public were told this truth—that masks might help a little bit, but since we do not have enough of them we ought to let healthcare workers use them—perhaps there would be less hoarding. 

Another failure on the mask front has been due to bad psychology. People were told only to wear masks when they were sick. However, if we follow this measure, masks will become a mark of infection, and will instantly turn wearers into pariahs. (What is more, many people are infectious when they do not know it.) In this case, ritualistic use of masks may be wise, since it will eliminate the shame while perhaps marginally reducing infection rates.

The wisest course, then, may indeed involve a bit of ritual, at least for the time being. In the absence of conclusive evidence for many of these measures, it is likely the best that we can do. I will certainly abide by what the health authorities instruct me to do. But the lessons of psychology do cause a little pinprick in my brain, as I repeatedly wonder if we are just grasping at a sense of control in a situation that is, for the most part, completely beyond our means to control.

I certainly crave a sense of control. Though I have not been obsessively disinfecting everything in my house, I have been obsessively reading about the virus, hoping that knowledge will give me some sort of power. So far it has not, and I suspect this is not going to change.