
Did Math Kill God?

A new book on Renaissance mathematics makes a bold case.


Once upon a time, a great Italian published a work called the Sidereus Nuncius. Galileo had seen the moons of Jupiter through his telescope. He had seen the phases of Venus. So, in 1610, he endorsed the ideas Copernicus had written down more than a half-century earlier in the De revolutionibus orbium coelestium. The earth moves around the sun, Galileo said. On February 24, 1616, the Qualifiers of the Inquisition declared heliocentrism heretical. After a trial Galileo was sentenced to house arrest in 1633. There he stayed ever after, moving indeed around a sun, but stuck indoors while thinking about it.

You have likely heard this tale many times, but what does the story mean? For children it shows that you should stick to your guns when you know you’re right—especially if you’re a scientist; for adults it represents a key moment in the development of astronomy and the sciences in general. And those two lessons bind into a bigger story that we use to define who we are, in our time. The Galileo Affair becomes part of a metanarrative, or, in Jean-François Lyotard’s term, a Grand Narrative. It says that early seventeenth-century Europe hung at a crux, with religion pulling it backward into medieval ignorance and science straining to push time forward into modernity. Against the benighted church Galileo labored, alongside the other great thinkers of the early seventeenth century who gave rise to our rational modern age.

As many scholars have observed in the past century or so, this is a deeply suspect way to think about events from the past. When Lyotard coined his term, he was observing the tendency to conceive of knowledge in the form of story-telling. Such narratives legitimize certain ideas, he argued. The story of Jesus is not merely a biography, but legitimizes Christianity as a social norm. In the same way, history and science are driven by narrative. When we learn about physics in class, we do not ourselves perform every experiment. We hear the narrative about forces exerting themselves in equal and opposite directions, and believe it. When those narratives culminate in a metanarrative, we see a result that seems almost pre-determined. We see things as pursuing a teleology.

THE GREAT RIFT: LITERACY, NUMERACY, AND THE RELIGION-SCIENCE DIVIDE, by Michael E. Hobart.
Harvard University Press, 520 pp., $39.95.

In a new book called The Great Rift: Literacy, Numeracy, and the Religion-Science Divide, Michael E. Hobart offers a new twist on a huge old metanarrative: the death of God. Something or other happened in Renaissance Europe, the story goes, and it eventually distanced scientists from religion. Hobart locates this great shift in the field of mathematics. Other historians have given credit to experimenters who pioneered the scientific method, or astronomers like Galileo or Kepler, but Hobart claims that Renaissance mathematics is distinct from its medieval predecessor because it reconceived numeracy, turning it from a tool for describing the quantities of things into an abstract system for describing relations between them. Scholars began thinking “with empty and abstract information symbols,” which catalyzed a revolution from “thing-mathematics” to “relation-mathematics.” Because this form of knowledge went beyond ordinary language, which previously was the primary means of conveying information, people slowly began to conceive of a world contingent on “natural” laws rather than the word of God.

To make this argument, Hobart presents a virtuosic array of evidence. He opens by laying out the context of the religion-versus-science debate, pointing to various thinkers’ conflicting accounts of whether the two fields conflict. He cites Stephen Jay Gould’s theory of non-overlapping magisteria (NOMA)—the idea that science and religion belong properly to two different domains of knowledge and thus can remain separate—and Huston Smith’s assertion that science is subservient to God. Hobart does not come down on one side, but rather shows that the debate is not over, and that he has a new interpretation of the problem to offer.

Turning to history proper, Hobart then attempts a very broad survey indeed, mapping the scientific trends of medieval Europe. Though he ranges from Aristotle to Aquinas—and many points in-between—his core point is that alphabetic literacy was the period’s “underlying informational technology,” and that this technology influenced and framed a “presumptive harmony of science and religion.” Moving into the Renaissance, he argues, we see new forms of information technology appear. Texts by Euclid, Archimedes, Apollonius, and more were rediscovered in the sixteenth century, while Hindu-Arabic numerals replaced the Roman system. I am an appalling mathematician, but I just about understood Hobart’s discussion of the new Renaissance geometry: its interest in reverse engineering, its systems of musical notation to describe the magic of harmony. His analysis all leads up to Galileo’s exploitation of the new information technology of relational mathematics, “especially in his analyses of free fall, pendulums, and projectiles, and in his efforts to mathematize matter.”

The Great Rift contains a huge wealth of historical anecdote, and Hobart marshals it confidently, but he tends to wobble when he makes grand claims. You can tell it’s happening because the passive voice bobs up like a bad apple.

Of the sixteenth century, Hobart writes that mathematical analysis, with its “abstract and functional thinking about natural processes,” would ultimately rid science of any lingering religious beliefs. As a result, “a new mindset, the analytical temper, was in the making.” Really? Large claims about the interiority of human beings from a different time are not so easily proven. Hobart makes them in lumpen sweeps. Writing on reverse engineering, in which deduction is used to reveal an object’s nature, he says, “Suffice for now that by the early 1600s, a new sort of mental activity was flowering: algebra, geometry, and physics were being revolutionized.” And concluding a section: “As we observed at the outset of this extended essay, the medieval harmony has been abandoned, replaced by a rift between science and religion.”

In each of these uses of the passive voice—that something was being done, that something had been abandoned—there is a gap. In each scientific analysis, Hobart goes from step to step judiciously. But each time he returns to the arc of his book-length argument, he rests passively on the extant metanarrative.

Information technology and its effect on the way we think and feel is a crucial issue in our time. In his 1979 book The Postmodern Condition: A Report on Knowledge, Lyotard wrote, “It is conceivable that the nation-states will one day fight for control of information, just as they battled in the past for control over territory, and afterwards for control of access to and exploitation of raw materials and cheap labor.” This has come true. The information technology of a given historical moment must determine, to some extent, the way that we communicate with each other.

But to claim that information technology controls the way an individual’s mind works edges into technological determinism, the Marx-adjacent theory that a society’s tools determine its culture. On this view, the printing press itself facilitated the intellectual culture that used it; enslaved people in America were enslaved partly by the cotton gin, not only by other people. And in our age of hyper-communication, so obviously determined and controlled and facilitated by private corporations (which also meddle in statecraft), the kind of explanation that Hobart offers in The Great Rift will find many willing readers.

Although it is impossible to fault the passion with which he has ventured into historical mathematics’ every daunting nook and baffling cranny, the speed and the sweep of Hobart’s argument make it hard for a reader suspicious of metanarrative to remain unsuspicious of his book. Technological determinism is perhaps the great intellectual temptation of our decade—try not to fall for it.