The End of Nature v. Nurture?

Editor’s Note: This post is part of a series further exploring “The Two Year Window,” my feature story on babies, the brain, and poverty that appears in the new issue of TNR. Click here to access all of the supplemental material.

The debate over nature versus nurture has frequently exploded into politics. Probably the most memorable instance of this, at least in recent history, was in 1994, when conservative writers Charles Murray and the late Richard Herrnstein published The Bell Curve.

The Bell Curve made a number of explosive claims about the supposed significance of IQ and the relationship between intelligence and race. But, at its core, The Bell Curve was an argument for the “nature” side of the debate. Roughly speaking, Murray and Herrnstein claimed that genetic differences were the primary reason for disparities in cognitive and intellectual abilities – disparities that led to inequality much later in life. And because these differences were genetically predetermined, the authors argued, attempting to redress them with public policy was not worthwhile.

The book was controversial from the get-go, thanks in no small part to an excerpt that ran in this magazine, giving it a certain intellectual legitimacy – although this magazine later ran lengthy and fairly devastating critiques as well. But the real irony of The Bell Curve, and the attendant controversy, was that the authors’ claims were outdated even before they published them. By the 1990s, developmental psychologists generally agreed that both nature and nurture were important.

So, yes, genes do matter. But so does environment. In fact, the two interact.

For some time, the focus of research has been on exactly how they interact – and why. That knowledge has grown dramatically in the last ten to fifteen years, thanks in part to the neuroscience developments I chronicle in my article.

Think of your DNA as a blueprint: From the day you are born – actually, from the day you are conceived – your cells will replicate themselves, grow into different body parts, and then function based on what the blueprint says. But environmental factors will play a huge role in determining how cells read that blueprint. Your mother’s prenatal diet will matter. So will your interactions with other people, particularly very early in your life.

The first understanding of this interaction came from animal studies, particularly studies on rats, which demonstrated that experience early in life had long-lasting physiological effects. More recently, researchers have seen the same kinds of changes in humans. A potentially big step forward came this summer, thanks to the paper by Stacy Drury, of Tulane, that I mention early in the article.

The paper grows out of the Bucharest Early Intervention Project, the controlled study that compares development among different groups of children, some of whom spent more time than others in the infamous Romanian orphanages. Drury and her team of researchers found that telomeres, the protective caps at the ends of chromosomes, were shorter in children who had spent more time in the orphanages – making them what Drury calls “a biological marker” of adversity.

Jack Shonkoff, a pediatrician and director of the National Council on the Developing Child at Harvard University, uses another term. He describes these physiological changes as part of a “subconscious biological memory” and says knowledge of them represents a major breakthrough: “Now we have a scientific understanding of the biology of adversity – how these bad experiences get into your body, under your skin, and into your brain.”

The same research has linked adversity to physical health problems. A few months ago, The New Yorker ran a terrific article by Paul Tough called “The Poverty Clinic.” It drew heavily on work by researchers at the University of California, Berkeley, using a data set culled from the patients of the Kaiser Permanente health care system. That study showed correlations between abuse, neglect, and other childhood adversity and future health problems such as cardiovascular disease.

The strong role of environment in development is the ultimate rejoinder to Herrnstein, Murray, and everybody else who argues that public policy can’t make a difference. The science, Shonkoff says, “clearly illustrates that this is not inevitable” – and that interventions can work, particularly if they take place early.

In the article, I talk about some of those interventions – and one in particular that has shown a lot of promise. More on that later.

Update: I’ve reworded this item (in response to a letter from a friend) to make clear that genes still play an important role in development. The question is whether nature, as opposed to nurture, is as dominant as Murray and Herrnstein suggested. The consensus among the scientists I interviewed was that it is not – i.e., that Murray and Herrnstein significantly understated the influence of environment.

Correction Added: The original version of this piece misidentified Jack Shonkoff’s affiliation – he works at the National Council on the Developing Child at Harvard University. The piece has been changed to reflect that fact.