Everything has a hundred parts and a hundred faces: I take one of them and sometimes just touch it with the tip of my tongue or my fingertips, and sometimes I pinch it to the bone. I jab into it, not as wide but as deep as I can; and I often prefer to catch it from some unusual angle.

—Michel de Montaigne

The pursuit of knowledge has this paradoxical quality: it demands perfection and yet continuously, inevitably, and endlessly fails in its goal.

Knowledge demands perfection because it is meant to be true, and truth is either perfect or nonexistent—or so we like to assume.

Normally, we think about truth like this: I make a statement, like “the cat is on the mat,” and this statement corresponds to something in reality—a real cat on a real mat. This correspondence must be perfect to be valid; whether the cat is standing just beside the mat or up a tree, the statement is equally false.

To formulate true statements—about the cosmos, about life, about humanity—this is the goal of scholarship. But can scholarship end? Can we get to a point at which we know everything and we can stop performing experiments and doing research? Can we reach the whole truth?

This would require scholars to create works that were both definitive—unquestioned in their authority—and exhaustive—covering the entire subject. What would this entail? Imagine a scholar writing about the Italian Renaissance, for example, who wants to write the perfect work, the book that totally and completely encapsulates its subject, rendering all additional work unnecessary.

This seems as if it should be theoretically possible, at least. The Italian Renaissance was a sequence of events—individuals born, paintings painted, sculptures sculpted, trips to the toilet, accidental deaths, broken hearts, drunken brawls, late-night conversations, outbreaks of the plague, political turmoil, marriage squabbles, and everything else, great and small, that occurred within a specific period of time and space. If our theoretical historian could write down each of these events, tracing their causation, allotting each its proportional space, neutrally treating each fact, then perhaps the subject could be definitively exhausted.

There are many obvious problems with this, of course. For one, we don’t have all the facts available, but only a highly selective, imperfect, and tentative record, a mere sliver of a fraction of the necessary evidence. Another is that, even if we did have all the facts, a work of this kind would be enormously long—in fact, as long as the Italian Renaissance itself. This alone makes the undertaking impossible. But this is also not what scholars are after.

A book that represented each fact neutrally, in chronological sequence, would not be an explanation, but a chronicle; it would recapitulate reality rather than probe beneath the surface; or rather, it would render all probing superfluous by representing the subject perfectly. It would be a mirror of reality rather than a search for its fundamental form.

And yet our brains are not, and can never be, impartial mirrors of reality. We sift, sort, prod, search for regularities, test our assumptions, and in a thousand ways separate the important from the unimportant. Our attention is selective of necessity, not only because we have a limited mental capacity, but because some facts are much more necessary than others for our survival.

We have evolved, not as impartial observers of the world, but as actors in a contest of life. It makes no difference, evolutionarily speaking, if our senses represent “accurately” what is out there in the world; it is only important that they alert us to threats and allow us to locate food. There is reason to believe, therefore, that our senses cannot be literally trusted, since they are adapted to survival, not truth.

Survival is, of course, not the operative motivation in scholarship. More generally, some facts are more interesting than others. Some things are interesting simply in themselves—charms that strike the sight, or merits that win the soul—while others are interesting in that they seem to hold within themselves the reason for many other events.

A history of the Italian Renaissance that gave as much space to a blacksmith as to Pope Julius II, or as much to a parish church as to the Sistine Chapel, would be unsatisfactory, not because it was inaccurate, but because its priorities would be in disarray. All intellectual work requires judgment. A historian’s accuracy might be unimpeachable, and yet his judgment so faulty as to render his work worthless.

We have just introduced two vague concepts into our search for knowledge: interest and judgment—interest being the “inherent” value of a fact, and judgment our faculty for discerning interest. Both of these are clearly subjective concepts. So instead of impartially representing reality, our thinkers experience it through a distorted lens—the lens of our senses, further shaped by culture and upbringing—and from this blurry image of the world they select the portion they deem important.

Their opinion of what is beautiful, what is meritorious, what is crucial and what is peripheral, will be based on criteria—either explicit or implicit—that are not reducible to the content itself. In other words, our thinkers will be importing value judgments into their investigation, judgments that will act as sieves, catching some material and letting the rest slip by.

Even more perilous, perhaps, than the selection of facts, will be the forging of generalizations. Since, with our little brains, we simply cannot represent reality in all its complexity, we resort to general statements. These are statements about the way things normally happen, or the characteristics that things of the same type normally have—statements that attempt to summarize a vast number of particulars within one abstract tendency.

All generalizations employ inductive reasoning, and thus are vulnerable to Hume’s critique of induction. A thousand red apples are no proof that the next apple will also be red. And even if we accept that generalizations are always more or less true—true as a rule, with some inevitable exceptions—this leaves undefined how well the generalization fits the particulars. Is it true nine times out of ten, or only seven? How many apples out of a hundred are red? Finally, making a generalization requires selecting one quality—say, the color of apples rather than their size or shape—from among the many that the particulars possess, and this selection is always arbitrary.
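To make Hume’s point concrete, here is a minimal sketch in Python, with invented numbers, using Laplace’s classical “rule of succession” (a standard formula from probability theory, not anything proposed in this essay): no matter how many red apples we observe, the estimated probability that the next one is red never reaches certainty.

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: having seen `successes` red apples
    in `trials` observations, estimate the probability that the next
    apple is also red as (successes + 1) / (trials + 2)."""
    return Fraction(successes + 1, trials + 2)

# Hypothetical data: every apple observed so far has been red.
for n in (10, 100, 1000):
    p = rule_of_succession(n, n)
    print(f"{n:>4} red apples seen -> P(next is red) = {p} ~ {float(p):.4f}")
```

The probability creeps toward one (roughly 0.92, then 0.99, then 0.999) but never arrives; that asymptote is Hume’s critique put into numbers.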

More hazardous still is the act of interpretation. By interpretation, I mean deciding what something means. Now, in some intellectual endeavors, such as the hard sciences, interpretation is not strictly necessary; only falsifiable knowledge counts. Thus, in quantum mechanics, it is unimportant whether we interpret the equations according to the Copenhagen interpretation or the Many-Worlds interpretation—whether the wave-function collapses, or reality splits apart—since in any case the equations predict the right result. In other words, we aren’t required to scratch our heads and ask what the equations “mean” if they spit out the right number; this is one of the strengths of science.

But in other fields, like history, interpretation is unavoidable. The historian is dealing with human language, not to mention the vagaries of the human heart. This alone makes any sort of “objective” knowledge impossible in this realm. Interpretation deals with meaning; meaning only exists in experience; experience is always personal; and the personal is, by definition, subjective. Two scholars may differ as to the meaning of, say, a passage in a diplomat’s diary, and neither could prove the other was incorrect, although one might be able to show her interpretation was far more likely than her counterpart’s.

Let me stop and review the many pitfalls on our road to perfect knowledge of the Italian Renaissance. First, we begin with an imperfect record of information; then we must make selections from this imperfect record. This selection will be based on vague judgments of importance and interest—what things are worth knowing, which facts explain other facts. We will also try to make generalizations about these facts—generalizations that are always tentative, arbitrary, and hazardous, and which are accurate to an undetermined extent. After all this, we must interpret: What does this mean? Why did this happen? What is the crucial factor, and what is mere surface detail? And remember that, before we even start, we are depending on a severely limited perception of the world, and a perspective warped by innumerable prejudices. Is it any wonder that scholarship goes on without end?

At this point, I am feeling a bit like Montaigne, chasing my thoughts left and right, trying to weave disparate threads into a coherent whole, and wondering how I began this already overlong essay. Well, that’s not so bad, I suppose, since Montaigne is the reason I am writing here in the first place.

Montaigne was a skeptic; he did not believe in the possibility of objective knowledge. For him, the human mind was too shifting, the human understanding too weak, the human lifespan too short, to have any hope of reaching a final truth. Our reasoning is always embodied, he observed, and is thus subject to our appetites, excitements, passions, and fits of lassitude—to all of the fancies, hobbyhorses, prejudices, and vanities of the human personality.

You might think, from the foregoing analysis, that I take a similar view. But I am not quite so cheerfully resigned to the impossibility of knowledge. It is impossible to arrive at absolute truth (and even if we could, we could never be sure that we had). Through science, however, we have developed a self-correcting methodology that allows us to approach ever nearer to the truth, as evidenced by our increasing ability to manipulate the world around us through technology. To be sure, I am no worshiper of science, and I think science is fallible and limited to a certain domain. But total skepticism regarding science would, I think, be foolish and wrong-headed: science does what it’s supposed to do.

What about domains where the scientific method cannot be applied, like history? Well, here more skepticism is certainly warranted. Since so much interpretation is needed, and since the record is so imperfect, conclusions are always tenuous. Nevertheless, this is no excuse to be totally skeptical, or to regard all conclusions as equally valid. Historians must still make logically consistent arguments and back up claims with evidence; their theories must still plausibly explain that evidence, and their generalizations must fit the facts. In other words, even if a historian’s thesis cannot be falsified, it must still conform to certain intellectual standards.

Unlike in science, however, interpretation does matter, and it matters a great deal. And since interpretation is always subjective, it is possible for two historians to propose substantially different explanations for the same evidence, and for both of their theories to be equally plausible. Indeed, in an interpretive field like history, there will be as many valid perspectives as there are practitioners.

This brings us back to Montaigne again. Montaigne used his skepticism—his belief in the subjectivity of knowledge, in the embodied nature of knowing—to justify his sort of dilettantism. Since nobody really knows what they’re talking about, why can’t Montaigne take a shot? This kind of perspective, so charming in Montaigne, can be dangerous, I think, if it leads one to abandon intellectual standards like evidence and argument, or if it leads to an undiscerning distrust of all conclusions.

Universal skepticism can potentially turn into a blank check for fundamentalism, since in the absence of definite knowledge you can believe whatever you want. Granted, this would never have happened to Montaigne, since he was wise enough to be skeptical of himself above all; but I think it can easily befall the less wise among us.

Nevertheless, if proper respect is paid to intellectual standards, and if skepticism is always turned against oneself as well as one’s peers, then I think dilettantism, in Montaigne’s formulation, is not only acceptable but admirable:

I might even have ventured to make a fundamental study if I did not know myself better. Scattering broadcast a word here, a word there, examples ripped from their contexts, unusual ones, with no plan and no promises, I am under no obligation to make a good job of it nor even to stick to the subject myself without varying it should it so please me; I can surrender to doubt and uncertainty and to my master-form, which is ignorance.

Nowadays it is impossible to be an expert in everything. To be well-educated requires that we be dilettantes, amateurs, whether we want to be or not. This is not to be wholly regretted, for I think the earnest dilettante has a lot to contribute to the pursuit of knowledge.

Serious amateurs (to use an oxymoron) serve as intermediaries between the professionals of knowledge and the less interested lay public. They also serve as a kind of check on professional dogmatism. Because they have one tiptoe in the subject, and the rest of their body out of it, they are less likely to get swept away by a faddish idea or to conform to academic fashion. In other words, they are less vulnerable to groupthink, since they do not form a group.

I think serious amateurs might also make a positive contribution, at least in some subjects that require interpretation. Although the amateur likely has less access to information, and lacks the resources to carry out original investigation, every amateur brings her own perspective, which may be highly original; she may notice something previously unnoticed, and so put old material in a new light.

Although respect must be paid to expertise, and academic standards cannot be lightly ignored, it is also true that professionals do not have a monopoly on the truth—and for all the reasons we saw above, absolute truth is unattainable, anyway—so there will always be room for fresh perspectives and highly original thoughts.

Montaigne is the perfect example: a sloppy thinker, a disorganized writer, a total amateur, who was nonetheless the most important philosopher and man of letters of his time.
