Review: TITAN: the Life of John D. Rockefeller

Titan: The Life of John D. Rockefeller, Sr. by Ron Chernow

My rating: 4 of 5 stars

He played golf assiduously, always alone, matching his record on one day against his record on another; just what the saints do when they daily examine their conscience… Such was probably also the interest dominating Rockefeller’s chase after millions. He was beyond comparing himself with his competitors; he compared himself with himself.

—George Santayana

As a child of Sleepy Hollow, I have almost literally grown up in Rockefeller’s shadow. The best walking paths in the area are in the Rockefeller State Park Preserve, an expansive and beautiful slice of forest made from a part of Rockefeller’s former estate. I can also walk to Rockwood, a park with a gorgeous view of the Hudson River, where John’s brother William had his mansion (since demolished). John D. Rockefeller’s own mansion, Kykuit, sits atop the nearby Pocantico Hills, and is a popular tourist destination. And yet, aside from his reputation as an ultra-rich monopolist, I knew almost nothing about the man.

Thus I turned to Ron Chernow, and I am glad I did. For Rockefeller presents a challenging subject for would-be biographers. A private, reserved, even secretive man, John D. Rockefeller was a beguiling mixture of avarice and piety; and both during his life and after it he has provoked passionate praise and vicious criticism alike. Since Rockefeller was so guarded, never spontaneous or candid, and yet achieved such historical importance, it is hard to resist the urge to simplify his character, merely to fill up the lacunae he left. Luckily, Chernow’s patience and sensitivity allow him to paint a convincing and unforgettable portrait of this evasive figure.

As Chernow himself says, Rockefeller was the walking embodiment of Max Weber’s Protestant Ethic. He was actuated by a faith which told him that it was his holy duty to work zealously, and which taught him to see his own success as divine favor and his rivals’ failure as divine retribution. This faith in his mission and his rectitude gave him a purpose and a justification, pushing him to work more devotedly than his colleagues, and to feel no pangs of remorse for those he bruised along the way. His outstanding strengths were his iron will and his extreme deliberation. He kept to a rigid schedule, never acted impulsively, tabulated all of his personal expenses in a little booklet, and even showed up to work on his wedding day. This was a man who made money with the morbid devotion of a saint.

During the sections charting Rockefeller’s rise to success, I was filled with horrified disgust for the man. Such a joyless, self-righteous hypocrite—filling his pockets with gold and wagging his finger at the poor. I did not see anything to praise in his religion of money. Simple greed is noxious enough, but sanctimonious greed is revolting.

Yet by the end of the book I found that I both liked and admired the man, or at least the man he later became. For Rockefeller, while full of his own vices, was free of many of the vices we associate with the rich. He was neither ostentatious nor profligate; and if his puritan strictness seems joyless—his hatred of drink, cards, smoking, or anything remotely racy—it at least saved him from hedonistic debauchery. And as he grew older, he became more playful, giving away dimes to strangers, riding around in sporty automobiles, and obsessively playing golf. I was surprised to learn that Rockefeller retired early from his post at the helm of Standard Oil, ceasing all regular duties in his fifties, only retaining a symbolic title. Clearly, he saw more to life than work and money.

But Rockefeller’s greatest virtue was his charity. He gave profusely and generously throughout his life, even more than Andrew Carnegie. Much of this was new to me (for example, I had no idea he founded the University of Chicago); and this is no accident, since Rockefeller did not like putting his name on things. (His name was so vilified anyway it would likely have hampered his charities.) And contrary to what you might expect, Rockefeller’s philanthropic impulse was deep and genuine, something he had from the beginning of his life. According to Chernow, Rockefeller’s contributions to medical research revolutionized the field. So on a purely utilitarian tabulation of pain and pleasure inflicted, Rockefeller probably comes out positive in the end. (Rockefeller himself, of course, thought that his life had been virtuous from beginning to end, and never conceived charity as recompense.)

As I hope I have made clear, Rockefeller was a complex man—or, perhaps it is more accurate to say that he continually resists attempts to stereotype him, which is always uncomfortable. And it is a testament to Chernow’s ability that he captures Rockefeller in all these aspects. Now, this was my first Chernow biography and, I admit, I was somewhat disappointed at first. Naturally, I measured this book against Robert Caro’s The Power Broker, and found Chernow’s book very thin on historical background by comparison. But Chernow partially compensates for this with his fine psychological sensitivity, as sharp as a first-rate novelist. The result is a thoroughly engrossing biography, so good that I am left wishing Chernow had made it longer—specifically during Rockefeller’s early years. And you know a book is good when 700 pages does not satisfy.

(As an afterthought, I would like to note how gratifying it is when different books serendipitously overlap. I knew of Charles Strong as one of George Santayana’s best friends, familiar to me from Santayana’s autobiography and his letters. But I did not remember that Strong married Bessie Rockefeller, John’s eldest child, who went insane and died at the age of forty. Santayana helped to look after Bessie’s daughter, Margaret, and even gave her away at her wedding.)

View all my reviews

Review: Sidereal Messenger

Sidereus Nuncius, or The Sidereal Messenger by Galileo Galilei

My rating: 4 of 5 stars

A most excellent and kind service has been performed by those who defend from envy the great deeds of excellent men and have taken it upon themselves to preserve from oblivion and ruin names deserving of immortality.

This book (more of a pamphlet, really) is proof that you do not need to write many pages to make a lasting contribution to science. For it was in this little book that Galileo set forth his observations made through his newly improved telescope. In 50-odd pages, with some accompanying diagrams and etchings, Galileo quickly asserts the roughness of the Moon’s surface, avers the existence of many more stars than can be seen with the naked eye, and—the grand climax—announces the existence of the moons of Jupiter. Suddenly the universe seemed far bigger, and stranger, than it had before.

The actual text of Sidereus Nuncius does not make for exciting reading. To establish his credibility, Galileo includes a blow-by-blow account of his observations of the moons of Jupiter, charting their nightly appearance. The section on our Moon is admittedly more compelling, as Galileo describes the irregularities he observed as the sun passed over its surface. Even so, this edition is immeasurably improved by the substantial commentary provided by Albert Van Helden, who gives us the historical background necessary to understand why the book was so controversial, and charts the aftermath of its publication.

Though Galileo is sometimes mistakenly credited with inventing the telescope, spyglasses were widely available at the time; what Galileo did was improve his telescope far beyond the magnification commonly available. The result was that, for a significant span of time, Galileo was the only person on the planet with the technology to closely and accurately observe the heavens. The advantage was not lost on him, and he made sure that he published before he got scooped. In another shrewd move, he named the newly-discovered moons of Jupiter after the Grand Duke Cosimo II and his brothers, for which they were known as the Medicean Stars (back then, the term “star” meant any celestial object). This earned him patronage and protection.

Galileo’s findings were controversial because none of them aligned with the predictions of Aristotelian physics and Ptolemaic astronomy. According to the accepted view, the heavens were pure and incorruptible, devoid of change or imperfection. Thus it was jarring to find the moon’s surface bumpy, scarred, and mountainous, just like Earth’s. Even more troublesome were the Galilean moons. In the orthodox view the Earth was the only center of orbit; and one of the strongest objections against Copernicus’s system was that it included two centers, the Sun and the Earth (for the Moon). Galileo’s finding of an additional center of orbit meant that this objection ceased to carry any weight, since in any case we must posit multiple centers. Understandably there was a lot of skepticism at first, with some scholars doubting the efficacy of Galileo’s new instrument. But as other telescopes caught up with Galileo’s, and new anomalies were added to the mix—the phases of Venus and the odd shape of Saturn—his observations achieved widespread acceptance.

Though philosophers and historians of science often emphasize the advance of theory, I find this text a compelling example of the power of pure observation. For Galileo’s breakthrough relied, not on any new theory, but on new technology, extending the reach of his senses. He had no optical theory to guide him as he tinkered with his telescope, relying instead on simple trial-and-error. And though theory plays a role in any observation, some of Galileo’s findings—such as that the Milky Way is made of many small stars clustered together—are as close to simple acts of vision as possible. Even if Copernicus’s theory had not been available as an alternative paradigm, it seems likely to me that advances in the power of telescopes would have thrown the old worldview into crisis. This goes to show that observational technology is integral to scientific progress.

It is also curious to note the moral dimension of Galileo’s discovery. Now, the Ptolemaic system is commonly lambasted as narcissistically anthropocentric, placing humans at the center of it all. Yet it is worth pointing out that, in the Ptolemaic system, the heavens are regarded as pure and perfect, and everything below the moon as corruptible and imperfect (from which we get the term “sublunary”). Indeed, Dante placed the circles of paradise on the moon and the planets. So arguably, by making Earth the equal of the other planets, the new astronomy actually raised the dignity of our humble abode. In any case, I think that it is simplistic to characterize the switch from geocentricity to heliocentricity as a tale of declining hubris. The medieval Christians were hardly swollen with pride by their cosmic importance.

As you can see, this is a fascinating little volume that amply rewards the little time spent reading it. Van Helden has done a terrific job in making this scientific classic accessible.


Review: Almagest

The Almagest: Introduction to the Mathematics of the Heavens by Ptolemy

My rating: 4 of 5 stars

… it is not fitting even to judge what is simple in itself in heavenly things on the basis of things that seem to be simple among us.

In my abysmal ignorance, I had for years assumed that tracking the orbits of the sun and planets would be straightforward. All you needed was a starting location, a direction, and the daily speed—and, with some simple arithmetic and a bit of graph paper, it would be clear as day. Attempting to read Ptolemy has revealed the magnitude of my error. Charting the heavenly bodies is a deviously complicated affair; and Ptolemy’s solution must rank as one of the greatest intellectual accomplishments of antiquity—fully comparable with the great scientific achievements of the European Enlightenment. Indeed, Otto Neugebauer, the preeminent scholar of ancient astronomy, went so far as to say:

One can perfectly well understand the ‘Principia’ without much knowledge of earlier astronomy but one cannot read a single chapter in Copernicus or Kepler without a thorough knowledge of Ptolemy’s “Almagest”. Up to Newton all astronomy consists in modifications, however ingenious, of Hellenistic astronomy.

With more hope than sense, I cracked open my copy of The Great Books of the Western World, which includes a full translation of the Almagest in its 16th volume. Immediately repulsed by that text, I acquired instead a student’s edition published by the Green Lion Press. This proved an excellent choice. Through introductions, preliminaries, footnotes, and appendices—not to mention generous omissions—this edition attempts to make Ptolemy accessible to a diligent college student. Even so, someone with my background would still require months of dedicated study, with a teacher as a guide, to attain a thorough knowledge of this text. For the text is difficult in numerous ways.

Most obviously, this book is full of mathematical proofs and calculations, which are not exactly my strong suit. Ptolemy’s mathematical language—relying on the Greek geometrical method—will be unfamiliar to students who have not read some Euclid; and even when it is familiar, it proves cumbrous for the sorts of calculations the subject demands. To make matters worse, Ptolemy employs the sexagesimal system (based on multiples of 60) for fractions, so all his numbers must be converted into our decimal notation for calculation. What is more, even the months Ptolemy uses bear their Egyptian names (Thoth, Phaöphi, Athur, etc.), since Ptolemy was an Alexandrian Greek. Yet even if we put all these technical obstacles to the side, we are left with Ptolemy’s oddly infelicitous prose, which the translator describes thus:

In general, there is a sort of opacity, even awkwardness, to Ptolemy’s writing, especially when he is providing a larger frame for a topic or presenting a philosophical discussion.

Thus, even in the non-technical parts of the book, Ptolemy’s writing tends to be headache-inducing. All this combines to form an unremitting slog. Since my interest in the book was amateurish, I skimmed and skipped liberally. Yet the text is so rich that, even proceeding in this dilettantish fashion, I managed to learn a great deal.
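Incidentally, the sexagesimal conversion mentioned above is mechanical enough to sketch in a few lines. Here is a minimal example, using Ptolemy’s reported value of 23;51,20 degrees for the obliquity of the ecliptic (the “around 23 degrees” discussed below):

```python
def sexagesimal_to_decimal(whole, *parts):
    """Convert a sexagesimal number such as 23;51,20 (Ptolemy's value
    for the obliquity of the ecliptic, in degrees) to a decimal:
    each successive part is divided by a further power of 60."""
    value = float(whole)
    for i, part in enumerate(parts, start=1):
        value += part / 60.0 ** i
    return value

print(sexagesimal_to_decimal(23, 51, 20))  # ≈ 23.8556 degrees
```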

Ptolemy’s Almagest, like Euclid’s Elements, proved so comprehensive and conclusive when it was published that it rendered nearly all previous astronomical work obsolete or superfluous. For this reason, we know little about Ptolemy’s predecessors, since there was little point in preserving their work after Ptolemy summed it up in such magnificent fashion. As a result it is unclear how much of this book is original and how much is simply adapted. As Ptolemy himself admits, he owes a substantial debt to the astronomer Hipparchus, who lived around 200 years earlier. Yet it does seem that Ptolemy originated the novel way of accounting for the planets’ position and speed, which he puts forth in later books.

Ptolemy begins by explaining the method by which he will measure chords; this leads him to construct one of the most precise trigonometric tables from antiquity. Later, Ptolemy goes on to produce several proofs of spherical trigonometry, which allows him to measure distances on the inside of a sphere, making this book an important source for Greek trigonometry as well as astronomy. Ptolemy also employs Menelaus’ Theorem, which uses the fixed proportions of a triangle to establish ratios. (From this I see that triangles are marvelously useful shapes, since they are the only shape which is rigid—that is, the angles cannot be altered without also changing the ratio of the sides, and vice versa. This is also, by the way, what makes triangles such strong structural components.)
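In modern notation, Ptolemy’s chord of an arc θ in his standard circle of radius 60 reduces to 2R·sin(θ/2). A minimal sketch (the translation into modern trigonometry is mine, not Ptolemy’s procedure):

```python
import math

def chord(arc_deg, radius=60.0):
    """Ptolemy's chord of an arc: the straight line subtending an arc
    of arc_deg degrees in a circle of radius 60 (his standard radius).
    In modern terms this is 2 * R * sin(theta / 2)."""
    return 2.0 * radius * math.sin(math.radians(arc_deg) / 2.0)

print(chord(60.0))  # 60.0: the chord of 60 degrees equals the radius
```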

Ptolemy gets down to business by analyzing the sun’s motion. This is tricky for several reasons. For one, the sun does not travel parallel to the “fixed stars” (so called because the stars do not change position relative to one another), but rather at an angle, which Ptolemy calculates to be around 23 degrees. We now know this is due to earth’s axial tilt, but for Ptolemy it was the obliquity of the ecliptic (the angle of the sun’s path). Also, the angle at which the sun travels through the sky (straight overhead or nearer the horizon) is determined by one’s latitude; this also determines the seasonal shifts in day-length; and during these shifts, the sun rises at different points on the horizon. To add to these already daunting variables, the sun also shifts in speed during the course of the year. And finally, Ptolemy had to factor in the precession of the equinoxes—the equinoxes’ gradual westward drift along the ecliptic from year to year.

The planets turn out to be even more complex. For they all exhibit anomalies in their orbits which entail further complications. Venus, for example, not only speeds up and slows down, but also seems to go forwards and backwards along its orbit. This leads Ptolemy to the adoption of epicycles—little circles which travel along the greater circle, called the “deferent,” of the planet’s orbit. But to preserve the circular motion of the deferent, Ptolemy must place its center (called the “eccentric”) away from earth, in empty space. Then, Ptolemy introduces yet another point, the “equant,” also in empty space, about which the epicycle’s center sweeps with uniform angular speed. Thus the planet’s motion was circular around one point (the eccentric) and uniform around another (the equant), neither of which coincides with earth (so much for geocentric astronomy). In addition to all this, the orbit of Venus is not exactly parallel with the sun’s orbit, but tilted, and its tilt wobbles throughout the year. For Ptolemy to account for all this using only the most primitive observational instruments, and without the use of calculus or analytic geometry, is an extraordinary feat of patience, vision, and drudgery.

Even after writing all this, I am not giving a fair picture of the scope of Ptolemy’s achievement. This book also includes an extensive star catalogue, with the location and brightness of over one thousand stars observable with the naked eye. He argues strongly for earth’s sphericity (so much for a flat earth) and even offers a calculation of earth’s diameter (which came out about 28% too small). Ptolemy also calculates the distance from the earth to the moon, using the lunar parallax (the difference in the moon’s apparent position when seen from different places on earth), which comes out to the quite accurate figure of 59 earth radii. And all of this is set forth in dry, sometimes baffling prose, accompanied by pages of proofs and tables. One can see why later generations of astronomers thought there was little to add to Ptolemy’s achievement, and why Arabic translators dubbed it “the greatest” (from which we get the English name).
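The parallax calculation itself is simple enough to check in modern terms: one earth radius subtends the parallax angle at the moon’s distance, so d = 1/sin(p) in earth radii. A minimal sketch (the 57-arcminute figure is the modern mean horizontal parallax, used here purely for illustration, not Ptolemy’s own data):

```python
import math

def distance_from_parallax(parallax_arcmin):
    """Distance in earth radii implied by a horizontal lunar parallax,
    via d = 1 / sin(p): one earth radius subtends the angle p at the
    moon's distance."""
    p = math.radians(parallax_arcmin / 60.0)
    return 1.0 / math.sin(p)

print(round(distance_from_parallax(57.0)))  # about 60 earth radii
```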

A direct acquaintance with Ptolemy belies his popular image as a metaphysical pseudo-scientist, foolishly clinging to a geocentric model, using ad-hoc epicycles to account for deviations in his theories. To the contrary, Ptolemy scarcely ever touches on metaphysical or philosophical arguments, preferring to stay in the precise world of figures and proofs. And if science consists in predicting phenomena, then Ptolemy’s system was clearly the best scientific theory around for its range and accuracy. Indeed, a waggish philosopher might dismiss the whole question of whether the sun or the earth was at the “center” as entirely metaphysical (is it falsifiable?). Certainly it was not mere prejudice that kept Ptolemy’s system alive for so long.

Admittedly, Ptolemy does occasionally include airy metaphysical statements:

We propose to demonstrate that, just as for the sun and moon, all the apparent anomalistic motions of the five planets are produced through uniform, circular motions; these are proper to the nature of what is divine, but foreign to disorder and variability.

Yet notions of perfection seem hard to justify, even within Ptolemy’s own theory. The combined motions of the deferent and the epicycle do not make a circle, but a wavy shape called an epitrochoid. And the complex world of interlocking, overlapping, slanted circles—centered on imaginary points, riddled with deviations and anomalies—hardly fits the stereotypical image of an orderly Ptolemaic world.
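The combined deferent-and-epicycle path is easy to trace numerically. A minimal sketch with invented, illustrative parameters (not Ptolemy’s): the planet’s position is just the sum of two circular motions, and the resulting curve is the epitrochoid mentioned above.

```python
import math

def epicycle_position(t, R=1.0, r=0.3, w_def=1.0, w_epi=5.0):
    """Planet position at time t: the epicycle's center circles the
    deferent (radius R) while the planet circles the epicycle (radius
    r). The curve traced out is an epitrochoid, not a circle."""
    x = R * math.cos(w_def * t) + r * math.cos(w_epi * t)
    y = R * math.sin(w_def * t) + r * math.sin(w_epi * t)
    return x, y

# When the epicyclic motion runs against the deferent's, the planet
# appears to move backwards: this is how the model yields retrogradation.
print(epicycle_position(0.0))  # (1.3, 0.0)
```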

It must be said that Ptolemy’s system, however comprehensive, does leave some questions tantalizingly unanswered. For example, why do Mercury and Venus stay within a definite distance from the sun, and travel along at the same average speed as the sun? And why are the anomalies of the “outer planets” (Mars, Jupiter, Saturn) sometimes related to the sun’s motion, and sometimes not? All this is very easy to explain in a heliocentric model, but rather baffling in a geocentric one; and Ptolemy does not even attempt an explanation. Even so, I think any reader of this volume must come to the conclusion that this is a massive achievement—and a lasting testament to the heights of brilliance and obscurity that a single mind can reach.


Review: Life on the Mississippi

Life on the Mississippi by Mark Twain

My rating: 3 of 5 stars

And, mind you, emotions are among the toughest things in the world to manufacture out of whole cloth; it is easier to manufacture seven facts than one emotion.

This is an awkward book to review, since it consists of so many, varied sections. Yet it can be neatly divided between the first third and the remaining portion. After a few brief chapters about the mighty river and its history, the beginning section focuses on Twain’s young days as a steersman aboard Mississippi River steamboats. These are easily the best pages. As evinced by the Huckleberry Finn stories, Twain had a marvelous way of writing from a child’s perspective, naively learning to navigate the world. What is more, Twain does an excellent job in illustrating the extensive knowledge necessary to effectively pilot a steamboat—memorizing hundreds of landmarks, learning how to gauge speed and depth, and dealing with difficult coworkers.

The second section is a meandering account of a voyage he took two decades after leaving the steamboat business, when he was an accomplished author. By this point he was so famous that he had to travel under an assumed name. Here he pauses so often to lose himself in tributary wanderings that the narrative breaks down into a vaguely connected series of anecdotes, most of which seem obviously inflated or simply fictional. Though there is much to amuse in this section, I found myself growing increasingly restless and bored as I continued, eager for the end. Though I did not dislike this book as much as I did A Connecticut Yankee, I nevertheless felt that the joke had gone stale and that Twain was merely filling up space.

My reactions to Twain tend to shift violently. Again, in the beginning section of this work, when he is writing from the perspective of his younger self, his writing is energetic and witty and wide-eyed. But when he dons the cap of a raconteur, I tend to find his stories mechanical and dull. His account of the Pilots’ Association is an excellent example of this—proceeding in predictable steps to the inevitable conclusion. And when he shifts away from humor, the results can be pretty grim. His flat-footed tall tale of the man who sought revenge for his murdered family—a mix of the ghoulish and the sentimental—is a case in point.

Even with these faults and lapses, this book is an unforgettable portrait of a time and place that are gone for good, written by an indefatigably mordant pen.


Review: The New Organon

The New Organon by Francis Bacon

My rating: 4 of 5 stars

Since I’ve lately read Aristotle’s original, I thought I’d go ahead and read Bacon’s New Organon. The title more or less says it all. For this book is an attempt to recast the method of the sciences in a better mold. Whereas Aristotle spends pages and pages enumerating the various types of syllogisms, Bacon dismisses it all with one wave of the hand—away with such scholarly nonsense! Because Aristotle is so single-mindedly deductive, his scientific research came to naught; or, as Bacon puts it, “Aristotle, who made his natural philosophy a mere bond servant to his logic, thereby [rendered] it contentious and well-nigh useless.”

What is needed is not deduction—which draws trivial conclusions from absurd premises—but induction. More specifically, what is needed is a great many experiments, the results of which the careful scientist can sort into air-tight conclusions. Down with the syllogism; up with experiment. Down with the schoolmen; up with the scientists.

In my (admittedly snotty) review of Bacon’s Essays, I remarked that he would have done better to have written a work entirely in aphorisms. Little did I know that Bacon did just that, and it is this book. Whatever Bacon’s defects were as a politician or a philosopher, Bacon is the undisputed master of the pithy, punchy maxim. In fact, his writing style can be almost sickening, so dense is it with aphorism, so rich is it with metaphor, so replete is it with compressed thought.

In the first part of his New Organon all of the defects of Bacon’s style are absent, and all of his strengths are present in full force. Indeed, if this work consisted of only the first part, it would have merited five stars, for it is a tour de force. Bacon systematically goes through all of the errors the human mind is prone to when investigating nature, leaving no stone unturned and no vices unexamined, damning them all in epigram after epigram. The reader hardly has time to catch his breath from one astonishing insight, when Bacon is on to another.

Among these insights are, of course, Bacon’s famous four idols. We have the Idols of the Tribe, which consist of the errors humans are wont to make by virtue of their humanity. For our eyes, our ears, and our very minds distort reality in a systematic way—something earlier philosophers had, so far as I know, neglected to account for. We have then the Idols of the Cave, which are the foibles of the individual person, over and above the common limitations of our species. These may include certain pet theories, preferences, accidents of background, peculiarities of taste. And then finally we have the Idols of the Market Place, which are caused by the deceptive nature of language and words, as well as the Idols of the Theater, which consist of the various dogmas present in the universities and schools.

Bacon also displays a remarkable insight into psychology. He points out that humans are pattern-seeking animals, which leads us to sometimes see patterns which aren’t there: “The human understanding is of its own nature prone to suppose the existence of more order and regularity in the world than it finds.” Bacon also draws the distinction, made so memorable in Isaiah Berlin’s essay, between foxes and hedgehogs: “… some minds are stronger and apter to mark the differences of things, others to mark their resemblances.” Bacon also notes, in terms no psychologist could fault, a description of confirmation bias:

The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.

Part two, on the other hand, is a tedious, rambling affair, which makes the patient reader almost forget the greatness of the first half. Here, Bacon moves on from condemning the errors of others to setting up his own system. In his opinion, scientific enquiry is a simple matter of tabulation: make a table of every situation in which a given phenomenon is always found, and then make a table of every situation in which a given phenomenon is never found; finally, make a table of every situation in which said phenomenon is sometimes found, shake well, and out comes your answer.
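Bacon’s tables of presence and absence amount to a crude eliminative induction, which can be sketched in a few lines. The example data below are invented for illustration; the point is only the mechanism of keeping what every positive instance shares and every negative instance lacks:

```python
def baconian_induction(present_instances, absent_instances):
    """Bacon's tables of presence and absence, crudely: keep only those
    candidate 'natures' found in every instance where the phenomenon
    appears and in no instance where it is absent."""
    candidates = set.intersection(*(set(i) for i in present_instances))
    for instance in absent_instances:
        candidates -= set(instance)
    return candidates

hot = [{"motion", "light", "expansion"}, {"motion", "expansion"}]
not_hot = [{"light", "rest"}]
print(sorted(baconian_induction(hot, not_hot)))  # ['expansion', 'motion']
```

As the next paragraph argues, this is where the method falls short: without a hypothesis to say which “natures” are worth tabulating in the first place, the candidate set is unmanageably large.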

The modern reader will not recognize the scientific method in this process. For we now know that Bacon’s induction is not sufficient. (Though he does use his method to draw an accurate conclusion about the nature of heat: “Heat is a motion, expansive, restrained, and acting in its strife upon the smaller particles of bodies.”) What Bacon describes is more or less what we’d now call ‘natural history’, a gathering up of facts and a noting of regularities. But the scientific method proper requires the framing of hypotheses. The hypothesis is key, because it determines what facts need to be collected, and what relationship those facts will have with the theory in question. Otherwise, the buzzing world of facts is too lush and fecund to tabulate; there are simply too many facts. Furthermore, Bacon makes the somewhat naïve—though excusable, I think—assumption that a fact is simply a fact, whereas we now know that facts are basically meaningless unless contextualized; and, in science, it is the theory in question which contextualizes them.

The importance of hypotheses also makes deduction far more important than Bacon acknowledges. For the aspiring experimentalist must often go through a long chain of deductive reasoning before he can determine what experiment should be performed in order to test a theory. In short, science relies on both deductive and inductive methods, and the relationship of theory to data is far more intertwined than Bacon apparently thinks. (As a side note, I’d also like to point out that Bacon wasn’t much of a scientist himself; he brings up the Copernican view of the heliocentric solar system many times, only to dismiss it as ridiculous, and also seems curiously unaware of the other scientific advances of his day.)

In a review of David Hume’s Enquiry Concerning the Principles of Morals, I somewhat impertinently remarked that the English love examples—or, to use a more English word, instances. I hope not to offend any English readers, but Bacon confirms me in this prejudice—for the vast bulk of this work is a tedious enumeration of twenty-seven (yes, that’s almost thirty) types of ‘instances’ to be found in nature. Needless to say, this long and dry list of the different sorts of instances makes for both dull reading and bad philosophy, for I doubt any scientist in the history of the world ever made progress by sorting his results into one of Bacon’s categories.

So the brilliant, brash, and brazen beginning of this book fizzles out into pedantry that, ironically enough, rivals even Aristotle’s original Organon. So, to repeat myself, the title of this book more or less says it all.


Review: Organon (Aristotle)

Organon by Aristotle

My rating: 3 of 5 stars

Aristotle continues to provoke conflicting reactions in me. I am always torn between realizing his tremendous originality and historical importance, and suffering from his extraordinary dullness. This book exemplifies both sides of the coin. Seeing a man single-handedly create the field of logic, ex nihilo, is tremendous; yet reading through these treatises could put a coffee-addict in a coma.

I am not insensitive to the appeals of philosophy. Far from it; I think reading philosophy is thrilling. Some of my most acute aesthetic experiences have come from contemplating some philosopher’s idea. Yet I have never had this reaction to Aristotle’s writings. Part of this is due to his formidable difficulty; another part, to the nature of the works (which, I must constantly remind myself, are lecture-notes).

Nevertheless, Aristotle had a prosaic mind; even when faced with the most abstract phenomena in the universe, his first reaction is to start parceling everything into neat categories, and to go on making lists and explanations of these categories. He does make logical arguments, but they are often brief, and almost as often unsatisfactory. Much of the time the student is faced with the dreary task of working his way through Aristotle’s system, simply because it is his system, and not because it is empirically or logically compelling.

(Every time I write a review for Aristotle, it comes out so disappointed. Let me try to be more positive.)

My favorite piece in this was the Posterior Analytics, which is a brilliant treatise on epistemology, logic, and metaphysics. Aristotle succinctly presents an entire theory of knowledge, and it’s incomparably more rigorous and detailed than anything Plato could have produced. I also particularly liked the Topics, as there we see Aristotle as a seasoned debater, and not merely a bumbling professor. Of course, there is much of strictly philosophic interest in this work as well; a particularly memorable problem is that of the future naval-battle.

For me, Aristotle is at his best when he is discussing the acquisition of knowledge. For Aristotle, whatever his faults, more perfectly embodied the love of knowledge than any other thinker in history. He wanted to know all; and, considering his historical limitations, he came damn close. We poor moderns have to content ourselves with either a mastery of one tiny slice of reality, or a dilettante acquaintance with all of it; Aristotle had the whole world at his fingertips.

View all my reviews

Review: Autobiography (Darwin)

The Autobiography of Charles Darwin, 1809–82 by Charles Darwin

My rating: 4 of 5 stars

I have attempted to write the following account of myself, as if I were a dead man in another world looking back at my own life. Nor have I found this difficult, for life is nearly over with me. I have taken no pains about my style of writing.

This is the quintessential scientific autobiography, a brief and charming book that Darwin wrote “for nearly an hour on most afternoons” for a little over two months. Originally published in 1887—five years after the naturalist’s death—it was somewhat censored, the more controversial religious opinions being taken out. It was only in 1958, to celebrate the centennial of The Origin of Species, that the full version was restored, edited by one of Darwin’s granddaughters, Nora Barlow.

The religious opinions that Darwin expresses are, nowadays, not enough to raise eyebrows. In short, his travels and his research slowly eroded his faith until all that remained was an untroubled agnosticism. What is interesting is that Darwin attributes to his loss of faith his further loss of sensitivity to music and to grand natural scenes. Apparently, in later life he found himself unable to experience the sublime. His scientific work also caused him to lose his appreciation for music, pictures, and poetry, which he heartily regrets: “My mind seems to have become a kind of machine for grinding general laws out of large collections of facts,” he says, and attributes to this the fact that “for many years I cannot endure to read a line of poetry.”

The most striking and lovable of Darwin’s qualities is his humility. He notes his lack of facility with foreign languages (which partially caused him to refuse Marx’s offer to dedicate Kapital to him), his terrible ear for music, his difficulty with writing, his incompetence in mathematics, and repeatedly laments his lack of higher aesthetic sensitivities. His explanation for his great scientific breakthrough is merely a talent for observation and dogged persistence. He even ends the book by saying: “With such moderate abilities as I possess, it is truly surprising that thus I should have influenced to a considerable extent the beliefs of scientific men on some important points.” It is remarkable that such a modest and retiring man should have stirred up one of the greatest revolutions in Western thought. Few thinkers have been more averse to controversy.

This little book also offers some reflection on the development of his theory—with the oft-quoted paragraph about reading Malthus—as well as several good portraits of contemporary thinkers. But the autobiography is not nearly as full as one might expect, since Darwin skips over his voyage on the Beagle (he had already written an excellent book about it) and since the second half of his life was extremely uneventful. For Darwin developed a mysterious ailment that kept him mostly house-bound, so much so that he did not even go to his father’s funeral. The explanation eluded doctors in his time and has resisted firm diagnosis ever since. But the consensus seems to be that it was at least in part psychological. It did give Darwin a convenient excuse to avoid society and focus on his work.

The final portrait which emerges is that of a scrupulous, methodical, honest, plainspoken, diffident, and level-headed fellow. It is easy to imagine him as a retiring uncle or a reserved high school teacher. That such a man, through a combination of genius and circumstance—and do not forget that he almost did not go on that famous voyage—could scandalize the public and make a fundamental contribution to our picture of the universe, is perhaps the greatest argument that ever was against the eccentric genius trope.

View all my reviews

Review: The Structure of Scientific Revolutions

The Structure of Scientific Revolutions by Thomas S. Kuhn

My rating: 5 of 5 stars

Observation and experience can and must drastically restrict the range of admissible scientific belief, else there would be no science. But they cannot alone determine a particular body of such belief. An apparently arbitrary element, compounded of personal and historical accident, is always a formative ingredient of the beliefs espoused by a given scientific community at a given time.

This is one of those wonderfully rich classics, touching on many disparate fields and putting forward ideas that have become permanent fixtures of our mental furniture. Kuhn synthesizes insights from history, sociology, psychology, and philosophy into a novel conception of science—one which, though seemingly no one fully agrees with it, has become remarkably influential. Indeed, this book made such an impact that the contemporary reader may have difficulty seeing why it was so controversial in the first place.

Kuhn’s fundamental conception is of the paradigm. A paradigm is a research program that defines a discipline, perhaps briefly, perhaps for centuries. This is not only a dominant theory, but a set of experimental methodologies, ontological commitments, and shared assumptions about standards of evidence and explanation. These paradigms usually trace their existence to a breakthrough work, such as Newton’s Principia or Lavoisier’s Elements; and they persist until the research program is thrown into crisis through stubborn anomalies (phenomena that cannot be accounted for within the theory). At this point a new paradigm may arise and replace the old one, such as the switch from Newton’s to Einstein’s system.

Though Kuhn is often spoken of as responding to Popper, I believe his book is really aimed at undermining the old positivistic conception of science: where science consists of a body of verified statements, and discoveries and innovations cause this body of statements to gradually grow. What this view leaves out is the interconnection and interdependence between these beliefs, and the reciprocal relationship between theory and observation. Our background orients our vision, telling us where to look and what to look for; and we naturally do our best to integrate a new phenomenon into our preexisting web of beliefs. Thus we may extend, refine, and elaborate our vision of the world without undermining any of our fundamental theories. This is what Kuhn describes as “normal science.”

During a period of “normal science” it may be true that scientific knowledge gradually accumulates. But when the dominant paradigm reaches a crisis, and the community finds itself unable to accommodate certain persistent observations, a new paradigm may take over. This cannot be described as a mere quantitative increase in knowledge, but is a qualitative shift in vision. New terms are introduced, older ones redefined; previous discoveries are reinterpreted and given a new meaning; and in general the web of connections between facts and theories is expanded and rearranged. This is Kuhn’s famous “paradigm shift.” And since the new paradigm so reorients our vision, it will be impossible to directly compare it with the older one; it will be as if practitioners from the two paradigms speak different languages or inhabit different worlds.

This scandalized some, and delighted others, and for the same reason: that Kuhn seemed to be arguing that scientific knowledge is socially solipsistic. That is to say that scientific “truth” was only true because it was given credence by the scientific community. Thus no paradigm can be said to be objectively “better” than another, and science cannot be said to really “advance.” Science was reduced to a series of fashionable ideas.

Scientists were understandably peeved by the notion, and social scientists concomitantly delighted, since it meant their discipline was at the crux of scientific knowledge. But Kuhn repeatedly denied being a relativist, and I think the text bears him out. It must be said, however, that Kuhn does not guard against this relativistic interpretation of his work as much as, in retrospect, he should have. I believe this was because Kuhn’s primary aim was to undermine the positivistic, gradualist account of science—which was almost universally held in the past—and not to replace it with a fully worked-out theory of scientific progress himself. (And this is ironic since Kuhn himself argues that an old paradigm is never abandoned until a new paradigm takes its place.)

Though Kuhn does say a good deal about this, I think he could have emphasized more strongly the ways that paradigms contribute positively to reliable scientific knowledge. For we simply cannot look on the world as neutral observers; and even if we could, we would not be any the wiser for it. The very process of learning involves limiting possibilities. This is literally what happens to our brains as we grow up: the confused mass of neural connections is pruned, leaving only the ones which have proven useful in our environment. If our brains did not quickly and efficiently analyze environmental stimuli into familiar categories, we could hardly survive a day. The world would be a swirling, jumbled chaos.

Reducing ambiguities is so important to our survival that I think one of the primary functions of human culture is to further eliminate possibilities. For humans, being born with considerable behavioral flexibility, must learn to become inflexible, so to speak, in order to live effectively in a group. All communication presupposes a large degree of agreement within members of a community; and since we are born lacking this, we must be taught fairly rigid sets of assumptions in order to create the necessary accord. In science this process is performed in a much more formalized way, but nevertheless its end is the same: to allow communication and cooperation via a shared language and a shared view of the world.

Yet this is no argument for epistemological relativism, any more than the existence of incompatible moral systems is an argument for moral relativism. While people commonly call themselves cultural relativists when it comes to morals, few people are really willing to argue that, say, unprovoked violence is morally praiseworthy in certain situations. What people mean by calling themselves relativists is that they are pluralists: they acknowledge that incompatible social arrangements can nevertheless be equally ethical. Whether a society has private property or holds everything in common, whether it is monogamous or polygamous, whether burping is considered polite or rude—these may vary, and yet create coherent, mutually incompatible, ethical systems. Furthermore, acknowledging the possibility of equally valid ethical systems also does not rule out the possibility of moral progress, as any given ethical system may contain flaws (such as refusing to respect certain categories of people) that can be corrected over time.

I believe that Kuhn would argue that scientific cultures may be thought of in the same pluralistic way: paradigms can be improved, and incompatible paradigms can nevertheless both have some validity. Acknowledging this does not force one to abandon the concept of “knowledge,” any more than acknowledging cultural differences in etiquette forces one to abandon the concept of “politeness.”

Thus accepting Kuhn’s position does not force one to embrace epistemological relativism—or, at least not the strong variety, which reduces knowledge merely to widespread belief. I would go further, and argue that Kuhn’s account of science—or at least elements of his account—can be made to articulate even with the system of his reputed nemesis, Karl Popper. For both conceptions have the scientist beginning, not with observations and facts, but with certain arbitrary assumptions and expectations. This may sound unpromising; but these assumptions and expectations, by orienting our vision, allow us to realize when we are mistaken, and to revise our theories. The Baconian inductivist or the logical positivist, by beginning with a raw mass of data, has little idea how to make sense of it and thus no basis upon which to judge whether an observation is anomalous or not.

This is not where the resemblance ends. According to both Kuhn and Popper (though the former is describing while the latter is prescribing), when we are revising our theories we should if possible modify or discard the least fundamental part, while leaving the underlying paradigm unchanged. This is Kuhn’s “normal science.” So when irregularities were observed in Uranus’ orbit, scientists could have either discarded Newton’s theories (fundamental to the discipline) or the theory that Uranus was the furthest planet in the solar system (a superficial fact); obviously the latter was preferable, and this led to the discovery of Neptune. Science could not survive if scientists too willingly overturned the discoveries and theories of their discipline. A certain amount of stubbornness is a virtue in learning.

Obviously, the two thinkers also disagree about much. One issue is whether two paradigms can be directly compared or definitively tested. Popper envisions conclusive experiments whose outcome can unambiguously decide whether one paradigm or another is to be preferred. There are some difficulties to this view, however, which Kuhn points out. One is that different paradigms may attach very different importance to certain phenomena. Thus for Galileo (to use Kuhn’s example) a pendulum is a prime exemplar of motion, while to an Aristotelian a pendulum is a highly complex secondary phenomenon, unfit to demonstrate the fundamental properties of motion. Another difficulty in comparing theories is that terms may be defined differently. Einstein said that massive objects bend space, but Newtonian space is not a thing at all and so cannot be bent.

Granting the difficulties of comparing different paradigms, I nevertheless think that Kuhn is mistaken in his insistence that they are as separate as two languages. I believe his argument rests, in part, on his conceiving of a paradigm as beginning with definitions of fundamental terms (such as “space” or “time”) which are circular (such as “time is that measured by clocks,” etc.); so that comparing two paradigms would be like comparing Euclidean and non-Euclidean geometry to see which is more “true,” though both are equally true to their own axioms (while mutually incompatible). Yet such terms in science do not merely define, but denote phenomena in our experience. Thus (to continue the example) while Euclidean and non-Euclidean geometries may both be equally valid according to their premises, they may not be equally valid according to how they describe our experience.

Kuhn’s response to this would be, I believe, that we cannot have neutral experiences, but all our observations are already theory-laden. While this is true, it is also true that theory does not totally determine our vision; and clever experimenters can often, I believe, devise tests that can differentiate between paradigms to most practitioners’ satisfaction. Nevertheless, as both Kuhn and Popper would admit, the decision to abandon one theory for another can never be a wholly rational affair, since there is no way of telling whether the old paradigm could, with sufficient ingenuity, be made to accommodate the anomalous data; and in any case a strange phenomenon can always be tabled as a perplexing but unimportant deviation for future researchers to tackle. This is how an Aristotelian would view Galileo’s pendulum, I believe.

Yet this fact—that there can be no objective, fool-proof criteria for switching paradigms—is no reason to despair. We are not prophets; every decision we take involves the risk that it will not pan out; and in this respect science is no different. What makes science special is not that it is purely rational or wholly objective, but that our guesses are systematically checked against our experience and debated within a community of dedicated inquirers. All knowledge contains an imaginative and thus an arbitrary element; but this does not mean that anything goes. To use a comparison, a painter working on a portrait will have to make innumerable little decisions during her work; and yet—provided the painter is working within a tradition that values literal realism—her work will be judged, not for the taste displayed, but for the perceived accuracy. Just so, science is not different from other cultural realms in lacking arbitrary elements, but in the shared values that determine how the final result is judged.

I think that Kuhn would assent to this; and I think it was only the widespread belief that science was as objective, asocial, and unimaginative as a camera taking a photograph that led him to emphasize the social and arbitrary aspects of science so strongly. This is why, contrary to his expectations, so many people read his work as advocating total relativism.

It should be said, however, that Kuhn’s position does alter how we normally think of “truth.” In this I also find him strikingly close to his reputed nemesis, Popper. For here is the Austrian philosopher on the quest for truth:

Science never pursues the illusory aim of making its answers final, or even probable. Its advance is, rather, towards the infinite yet attainable aim of ever discovering new, deeper, and more general problems, and of subjecting its ever tentative answers to ever renewed and ever more rigorous tests.

And here is what his American counterpart has to say:

Later scientific theories are better than earlier ones for solving puzzles in the often quite different environments to which they are applied. That is not a relativist’s position, and it displays the sense in which I am a convinced believer in scientific progress.

Here is another juxtaposition. Popper says:

Science is not a system of certain, or well-established, statements; nor is it a system which steadily advances towards a state of finality. Our science is not knowledge (episteme): it can never claim to have attained truth, or even a substitute for it, such as probability. … We do not know: we can only guess. And our guesses are guided by the unscientific, the metaphysical (though biologically explicable) faith in laws, in regularities which we can uncover—discover.

And Kuhn:

One often hears that successive theories grow ever closer to, or approximate more and more closely to, the truth… Perhaps there is some other way of salvaging the notion of ‘truth’ for application to whole theories, but this one will not do. There is, I think, no theory-independent way to reconstruct phrases like ‘really there’; the notion of a match between the ontology of a theory and its ‘real’ counterpart in nature now seems to me illusive in principle.

Though there are important differences, to me it is striking how similar their accounts of scientific progress are: the ever-increasing expansion of problems, or puzzles, that the scientist may investigate. And both thinkers are careful to point out that this expansion cannot be understood as an approach towards an ultimate “true” explanation of everything, and I think their reasons for saying so are related. For since Popper begins with theories, and Kuhn with paradigms—both of which stem from the imagination of scientists—their accounts of knowledge can never be wholly “objective,” but must contain an aforementioned arbitrary element. This necessarily leaves open the possibility that an incompatible theory may yet do an equal or better job in making sense of an observation, or that a heretofore undiscovered phenomenon may violate the theory. And this being so, we can never say that we have reached an “ultimate” explanation, where our theory can be taken as a perfect mirror of reality.

I do not think this notion jeopardizes the scientific enterprise. To the contrary, I think that science is distinguished from older, metaphysical sorts of enquiry in that it is always open-ended, and makes no claim to possessing absolute “truth.” It is this very corrigibility of science that is its strength.

This review has already gone on for far too long, and much of it has been spent riding my own hobby-horse without evaluating the book. Yet I think it is a testament to Kuhn’s work that it is still so rich and suggestive, even after many of its insights have been absorbed into the culture. Though I have tried to defend Kuhn from accusations of relativism or undermining science, anyone must admit that this book has many flaws. One is Kuhn’s firm line between “normal” science and paradigm shifts. In his model, the first consists of mere puzzle-solving while the second involves a radical break with the past. But I think experience does not bear out this hard dichotomy; discoveries and innovations may be revolutionary to different degrees, which I think undermines Kuhn’s picture of science evolving as a punctuated equilibrium.

Another weakness of Kuhn’s work is that it does not do justice to the way that empirical discoveries may cause unanticipated theoretical revolutions. In his model, major theoretical innovations are the products of brilliant practitioners who see the field in a new way. But this does not accurately describe what happened when, say, DNA was discovered. Watson and Crick worked within the known chemical paradigm, and operated like proper Popperians in brainstorming and eliminating possibilities based on the evidence. And yet the discovery of DNA’s double helix, while not overturning any major theoretical paradigms, nevertheless had such far-reaching implications that it caused a revolution in the field. Kuhn has little to say about events like this, which shows that his model is overly simplistic.

I must end here, after thrashing about ineffectually in multiple disciplines in which I am not even the rankest amateur. What I hoped to re-capture in this review was the intellectual excitement I felt while reading this little volume. In somewhat dry (though not technical) academic prose, Kuhn caused a revolution still forceful enough to make me dizzy.

View all my reviews

Review: Vanity Fair

Vanity Fair by William Makepeace Thackeray

My rating: 4 of 5 stars

The world is a looking glass, and gives back to every man the reflection of his own face. Frown at it, and it will in turn look sourly upon you; laugh at it and with it, and it is a jolly kind companion; and so let all young persons take their choice.

There seems to be little to say about Vanity Fair that is worth the time in saying it. This is an open book; its appeal is direct, its themes obvious, its interpretation unambiguous. It is an extended satire of Victorian England—what more is there to add?

I was prepared for the nineteenth-century prose; indeed, Thackeray’s unadorned style has aged uncommonly well. I had readied myself for its protracted length and copious cast of characters. I was even prepared for the strong authorial voice and frequent asides; in this, Thackeray follows Henry Fielding quite closely. But I was not quite ready for such a depressing novel. For the secret of Vanity Fair’s lasting success is not, I think, due merely to Thackeray’s execution—brilliant as it is—but owes itself far more to the novel’s triumphant immoralism.

Like many great novelists, Thackeray opens the book by introducing us to a pair of characters, Becky Sharp and Amelia Sedley, who are to be foils for each other. Amelia is simple and good, while Becky is calculating and wicked. Following the standard conventions, we should expect Amelia to emerge triumphant and Becky to be foiled. And yet Thackeray consistently and persistently flouts this expectation. Instead, he throws his characters into a world full of cowards, egoists, hypocrites, dullards, drunkards, gluttons, dandies, and every other species of vice—in short, Vanity Fair—and shows us that, in such a world, virtue is a luxury few can afford.

Indeed, the frightening thing about this novel is that Thackeray gradually pulls us into sympathy with Becky Sharp. The daughter of a painter and a dancing master, she hoists herself up from the lowest to the highest ranks of society using only her wit. In the process, it becomes clear that she is a sociopath in the proper sense of the word—seeing others as mere instruments, unable to care for anyone but herself. And yet we feel—we are made to feel—that she is not morally lower than those around her (who also only care for money and status), only cleverer and more determined.

In a word, Thackeray’s thesis is that, in our depraved world—where people care only for vanities, and where unjust accidents such as birth determine the distribution of these goods—the only logical course of action is to be ruthless. Thackeray completes this impression by showing how commonly virtue leads to misery. Amelia’s virtue, though genuine, is consistently made to look foolish. Her dedication to her husband is rendered ridiculous by her husband’s unfaithfulness, her dedication to her son rendered absurd by her son’s unconcern with leaving the house, and so on. For my part I found it very difficult to like her, and more often found myself rooting for Becky.

William Dobbin is the only character who is allowed to appear really admirable. Yet his virtue, too, is for most of the story ignored and unrewarded. And when he finally obtains his goal—by which time he has grown bitter with waiting—this is arguably caused, not by his action, but by Becky Sharp, the only effectively active character in the book.

The final result of this has been to leave me with a feeling of emptiness. Thackeray’s portrayal of Vanity Fair is convincing enough to leave the reader with a numbing sense of cynicism, scarcely pierced by the novel’s few tender moments. Despite this, I must recommend the book highly. Thackeray has, in many ways, aged better than his chief rival, Dickens. His prose is leaner and sharper, his characters more realistic, and his ethos free of Dickens’ dripping sentimentality. This is satire raised to a sweeping view of human life—which does not make it any funnier.

View all my reviews

Review: The Merchant of Venice

The Merchant of Venice by William Shakespeare

My rating: 3 of 5 stars

I can easier teach twenty what were good to be done, than be one of the twenty to follow mine own teaching.

In my first review of this play I agonized over whether it was truly anti-Semitic or not. Now I am unsure no longer: this play is undoubtedly anti-Semitic. The plot is simply incoherent if Shylock is to be regarded as anything but a villain. Sympathetic as we may be to a man so mistreated, we cannot sympathize with someone so single-mindedly bent on material gain and bloody vengeance. No playgoer can conscientiously hope, in the trial scene, that Shylock is successful in fulfilling his bond. And Shakespeare does not allow us to suspect that Shylock is bluffing: he is prepared to cut out a man’s flesh and weigh it on a scale (a traditional anti-Semitic image) simply because “it is my humour.” If Shakespeare was trying to be slyly subversive, he did a very poor job.

What provokes audiences into sympathy with Shylock is the end of the trial, in which, aside from being denied his money, he is forcibly converted to Christianity, on pain of death. To us this seems such an obvious mockery of justice, such an undeniable outrage, that we assume Shakespeare must have felt the same way, and to have written the scene to undermine all the Christian talk of mercy. Yet I do not think Shylock’s fate would have provoked anything like this reaction in Shakespeare’s England, where anti-Semitism was taken for granted. To the contrary, that such a greedy and bloodthirsty Jew should be spared some of his fortune and accepted into Christianity might have been seen as wholly just, even merciful.

The final result of this—Shylock’s villainy and the play’s anti-Semitism—made the trial scene literally sickening for me. One man, mistreated and spiteful, is trying to legally kill another man for defaulting on a debt, and he is in turn stripped of his property, his identity, and his honor—humiliated, kicked, and spat upon. And all this is delivered as the denouement of a romantic farce, complete with cross-dressing ladies and a playful love story. I admit that I was in no mood to overlook or excuse the anti-Semitism, having recently stood in the Ghetto Vecchio in Venice, and seen the monuments to the deported Jews there. Even so, I think anyone must admit that the play’s dramatic coherence is seriously compromised, even destroyed, by the decline of anti-Semitism.

It speaks to the power of Shakespeare’s art that, even in such an obviously anti-Semitic play, which so shamelessly uses anti-Jewish stereotypes, and which so joyfully persecutes its Jewish villain—even despite all this, we still read and stage this play. As often happens in life, charisma can deaden our moral senses; and Shylock is nothing if not charismatic. He is one of dozens of Shakespeare’s characters whose dialogue reveals a complete personality, a shifting mind whose depths we can only guess at, whose roving interior life extends into parts unknown. Somehow Shakespeare has conjured a character that embodies all of the negative Jewish stereotypes, yet who nevertheless is a believable and fully individual human. This is dramatically admirable and, in retrospect, morally reprehensible. For, as Harold Bloom said, Shylock’s very plausibility is why the play has been such a potent inspiration for anti-Semites.

I am not sure what conclusion to draw from all this. The play is without doubt one of Shakespeare’s stronger efforts. And yet, by the end, I felt little more than distress.

View all my reviews